DRI-186 for week of 5-10-15: How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

An Access Advertising EconBrief:

How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

The previous EconBrief explains how the classical theory of voluntary exchange and the moral concept of individual responsibility mutually reinforce each other. The mutually beneficial character of voluntary exchange allows individuals to assume responsibility for their own actions in a free society. Individual responsibility permits voluntary exchange to function without the necessity of, say, review of each transaction by a neutral third party to ensure fairness. The role of government in a voluntary society is minimal – to enforce contracts and prevent coercion.

Recently, the issue of responsibility for war crimes committed during World War II has been raised by various independent events. In Germany, a 93-year-old man is standing trial as an accessory to war crimes committed while he worked at the Auschwitz concentration camp. His presence in the camp is known, but his actual role and behavior are disputed. Should the prosecution have to prove he actually committed crimes, or would his participation as (say) a guard be enough to warrant his conviction as a war criminal?

A recent column in The Wall Street Journal by Bret Stephens (“From Buchenwald to Europe,” 05/05/2015) observes that many people in Germany were victims of Nazism, not Nazis – including many non-Jews. How should this affect Germany’s national policies today on European union, immigration and attitude toward systematic anti-Semitism and misogyny practiced by Muslim immigrants? “It isn’t easy, or ultimately wise, [for Germany] to live life in a state of perpetual atonement,” Mr. Stephens thinks.

Japan’s Prime Minister Shinzo Abe has publicly marveled at the transformation in relations between Japan and America, two countries that became deadly rivals in the late 1930s and waged total war in the 1940s, culminating in the only nuclear attacks in history. Today the two are among the planet’s closest trading partners. Abe clearly wants to enlist the cooperation of the U.S. in Japan’s efforts to re-arm against the imminent threat of mainland China’s sabre-rattling territorial ambitions. But Abe has also made disturbing noises in domestic politics, worshipping at the shrine of Japan’s war dead and speaking equivocally about Japan’s aggressive invasion of its Asian neighbors in the 1930s. These speeches are a rough Japanese analogue to Holocaust denial.

In deciding what to make of these events, our analytical anchor is once again the economic logic of individual responsibility arising in a context of voluntary exchange.

The Flawed Notion of National Responsibility for War Crimes

In his Wall Street Journal piece, Bret Stephens depicts “the drama of postwar Germany” as its “effort to bury the Nazi corpse,” which “haunts Germany at every turn.” This phrasing is troubling. It implies that Germany’s residents bear a collective burden for sins committed long before most of them were even born.

Not surprisingly, this burden hasn’t just been heavy – it has been unshakeable. “Should Germany’s wartime sins be expiated by subsidizing the spendthrift habits of corrupt Greek governments? Should fear of being accused of xenophobia require Germans to turn a blind eye to Jew-hatred and violent misogyny when the source is Germany’s Muslim minority?” These questions, posed rhetorically by Mr. Stephens, should be placed in the pantheon of pointlessness with queries about the angel-carrying capacity of pinheads.

Even before World War II ended, many people realized that the Axis powers would have to be called to account for their sins. Members of the German and Japanese governments and military had committed acts that plumbed new depths of depravity. Civilization had institutions and standards for judging and punishing the familiar forms of crime, but the scope and magnitude of Axis atrocities persuaded the Allies to hold separate war-crimes tribunals for Germany and Japan. And the defendants at every trial were individual human beings, not collective entities called “Germany” or “Japan.”

To be sure, there were arguments – some of them almost as bitter as the fighting that preceded the trials – about which individuals should be tried. At least some of the disagreement probably reflected disappointment that the most deserving defendants (Hitler, Goering et al) had cheated the hangman by committing suicide beforehand. But nobody ever entertained the possibility of putting either nation on trial. In the first place, it would have been a practical impossibility. And without an actual trial, the proceedings would have been a travesty of justice. Even beyond that, though, the greater travesty would have been to suggest that the entirety of either nation had been at fault for acts such as the murder of millions of Jews by the Nazis.

We need look no farther than Stephens’ own article to substantiate this. He relates the story of his father-in-law, Hermann, who celebrated his 11th birthday on VE-Day, May 8, 1945. He was the namesake of his father, a doctor who died in a German prison camp, where he had been imprisoned for the crime of xenophilia – showing friendly feelings to foreign workers. The elder Hermann apparently treated inhabitants of forced-labor camps and spoke openly of the likelihood of an ultimate Russian victory over Germany. Not only was he not committing atrocities, he was trying to compensate for their effects – and was killed for his pains. Were we supposed to prosecute his 11-year-old son? What madness that would have been! As Stephens put it, “what was a 10-year-old boy, whose father had died at Nazi hands, supposed to atone for?”

History tells us that Germany also harbored its own resistance movement, which worked behind the scenes to oppose Fascism in general and the war in particular. In fact, the Academy Award for Best Actor in 1943 went not to Humphrey Bogart, star of Best Picture winner Casablanca, but instead to Paul Lukas, who played a German who risked his life fighting the Nazis in the movie Watch on the Rhine. The Freiburg School, a German free-market school of economists formed before the war, openly opposed Fascist economic policies even during World War II. Their prestige was such that the Nazis did not dare kill them, instead preferring to suppress their views and prevent their professional advancement. Then there were the sizable number of Germans who did not join the Nazi Party and were not politically active.

Hold every contemporary German criminally accountable for the actions of Hitler, Goebbels, Hess, Goering, Mengele and the rest? Unthinkable. In which case, how can we even contemplate asking today’s Germans, who had no part in the war crimes, weren’t even alive when they were committed and couldn’t have prevented them even if inclined to try, to “atone” for them?

The longer we think about the notion of contemporary national guilt for war crimes, the more we wonder how such a crazy idea ever wandered into our heads in the first place. Actually, we shouldn’t wonder too long about that. The notion of national, or collective, guilt came from the same source as most of the crazy ideas extant.

It came from the intellectual left wing.

The Origin of “Social Wholes”

There is no more painstaking and difficult pastime than tracing the intellectual pedigree of ideas. The modern concept of the “social whole” or national collective seems traceable to the French philosopher Claude Henri de Rouvroy, Comte de Saint-Simon (hereinafter Saint-Simon). Saint-Simon is rightfully considered the father of Utopian Socialism. Born an aristocrat in 1760, he lived three lives – the first as a French soldier who fought for America in the Revolution, the second as a financial speculator who made and lost several fortunes, the third as an intellectual dilettante whose personal writings attracted the attention of young intellectuals and made him the focus of a cult.

Around age 40, Saint-Simon decided to focus his energies on intellectual pursuits. He was influenced by the intellectual ferment within France’s Ecole polytechnique, where the sciences of mathematics, chemistry, physics and physiology turned out distinguished specialists such as Lavoisier, Lagrange and Laplace. Unfortunately, Saint-Simon himself was able to appreciate genius but not to emulate it. Even worse, he was unable to grasp any distinction between the natural sciences and social sciences such as economics. In 1803, he wrote a pamphlet in which he proposed to attract funds by subscription for a “Council of Newton,” composed of twenty of the world’s most distinguished men of science, to be elected by the subscribers. They would be deemed “the representatives of God on earth,” thus displacing the Pope and other divinely ordained religious authorities, but with additional powers to direct the secular affairs of the world. According to Saint-Simon, these men deserved this authority because their competence in science would enable them to consciously order human affairs more satisfactorily than heretofore. Saint-Simon claimed to have received this plan in a revelation from God.

“All men will work; they will regard themselves as laborers attached to one workshop whose efforts will be directed to guide human intelligence according to my divine foresight [emphasis added]. The Supreme Council of Newton will direct their works… Anybody who does not obey their orders will be treated … as a quadruped.” Here we have the beginnings of the collective concept: all workers work for a single factory, under one central administration and one boss.

We can draw a direct line between this 1803 publication of Saint-Simon and the 20th century left-wing “Soviet of engineers” proposed by institutional economist Thorstein Veblen, the techno-socialism of J. K. Galbraith and the “keep the machines running” philosophy of Clarence Ayres. “Put government in the hands of technical specialists and give them absolute authority” has been the rallying cry of the progressive left wing since the 19th century.

Saint-Simon cultivated a salon of devotees who propagated his ideas after his death in 1825. These included most notably Auguste Comte, the founder of the “science” of sociology, which purports to aggregate all the sciences into one collective science of humanity. Comte inherited Saint-Simon’s disregard for individual liberty, referring contemptuously to “the anti-social dogma of the ‘liberty of individual conscience.'” It is no coincidence that socialism, which had its beginnings with Saint-Simon and his salon, eventually morphed into Nazism, which destroyed individual conscience so completely as to produce the Holocaust. That transformation from socialism to Nazism was described by Nobel laureate F. A. Hayek in The Road to Serfdom.

Today, the political left is committed to the concept of the collective. Its political constituencies are conceived in collective form: “blacks,” “women,” “labor,” “farmers,” “the poor.” Each of these blocs is represented by an attribute that blots out all trace of individuality: skin color, gender, economic class (or occupation), income. The collective concept implies automatic allegiance, unthinking solidarity. This is convenient for political purposes, since any pause for thought before voting might expose the uncomfortable truth that the left has no coherent policy program or set of ideas. The left traffics exclusively in generalities that attach themselves to social wholes like pilot fish to sharks: “the 1%,” “the 99%,” “Wall St. vs. Main St.,” “people, not profit,” “the good of the country as a whole.” This is the parlor language of socialism. The left finds it vastly preferable to nitty-gritty discussion of the reality of socialism, which is so grim that it couldn’t even be broached on college campuses without first issuing trigger warnings to sensitive students.

The left-wing rhetoric of the collective has special relevance to the question of war crimes. Actual war crimes are committed by individual human beings. Human beings live discrete, finite lives. But a collective is not bound by such limitations. For example, consider the business concept of a corporation. Every single human being whose efforts comprise the workings of the corporation will eventually die, but the corporation itself is – in principle – eternal. Thus, it is a collective entity that corresponds to left-wing notions because it acts as if animated by a single will and purpose. And the left constantly laments the obvious fact that the U.S. does not and cannot act with this singular unanimity of purpose. For decades, left-wing intellectuals such as Arthur Schlesinger and John Kenneth Galbraith have looked back with nostalgia at World War II because the U.S. united around the single goal of winning the war and subordinated all other considerations to it.

The Rhetorical Convenience of Collective Guilt

Given its collective bent, we would expect to find the left in the forefront of the “collective guilt” school of thought on the issue of war crimes. And we do. For the left, “the country” is one single organic unity that never dies. When “it” makes a ghastly error, “it” bears the responsibility and guilt until “it” does something to expiate the sin. That explains why Americans have been figuratively horsewhipped for generations about the “national shame” and “original sin” of slavery. It is now 152 years after the Emancipation Proclamation and 150 years since the end of the Civil War, in which a half-million Americans died to prevent slaveholding states from seceding from the Union. Following the Civil War, the U.S. Constitution was amended specifically to grant black Americans rights previously denied them. Yet “we” – that is, the collective entity of “the country” on which left-wing logic rests – have not yet expunged this legacy of slavery from “our” moral rap sheet. Exactly how the slate should be wiped clean is never clearly outlined – if it were, then the left wing would lose its rhetorical half nelson on the public debate over race – but each succeeding generation must carry this burden on its shoulders in a race-reversed reprise of the song “Ol’ Man River” from the musical Show Boat. “Tote that barge, lift that bale” refers in this case not to cotton but to the moral burden of being responsible for things that happened a century or more before our birth.

If this burden can be made heavy enough, it can motivate support for legislation like forced school busing, affirmative action and even racial reparations. Thus, the collective concept is a potentially powerful one. As Bret Stephens observes, it is now being pressed into service to prod Germany into bailing out Greeks, whose status as international deadbeats is proverbial. Exactly how were Greeks victimized by Germans? Were they somehow uniquely tyrannized by the Nazis – more so than, say, the Jews who later emigrated to Israel? No, Germany’s Nazism of seventy or eighty years ago is merely a handy pig bladder with which to beat today’s Germans over the head to extract blackmail money for the latest left-wing cause du jour. Since the money must come from the German government, German taxpayers must fork it over. A justification must be found for blackmailing German taxpayers. The concept of collective guilt is the ideal lever for separating Germans from their cash. Every single German is part of the collective; therefore, every single German is guilty. Voila!

The Falsity of Social Wholes

In The Counter-Revolution of Science (1952), Nobel laureate F.A. Hayek meticulously traced the pedigree of social wholes back to their roots. He sketched the life and intellectual career of Saint-Simon and his disciple Auguste Comte. Hayek then carefully exposed the fallacies behind the holistic method and explained why the unit of analysis in the social sciences must be the individual human being.

Holistic concepts like “the country” are abstract concepts that have no concrete referent because they are not part of the data of experience for any individual. Nobody ever interacts directly with “the country,” nor does “the country” ever interact directly with any other “country.” The only meaning possible for “the country” is the sum of all the individual human beings that comprise it, and the only possible theoretical validity for social wholes generally arises when they are legitimately constructed from their individual component parts. Indeed, Hayek views one role for social scientists as the application of this “compositive” method of partial aggregation as a means of deriving theories of human interaction.

The starting point, though, must be the individual – and theory can proceed only as far as individual plans and actions can be summed to produce valid aggregates. The left-wing historical modus operandi has reversed this procedure, beginning with one or more postulated wholes and deriving results, sometimes drawing conclusions about individual behavior but more often subsuming individuals completely within a faceless mass.

An example may serve to clarify the difference between the two approaches. The individualist approach, common to classical and neoclassical economics, is at home with the multifarious differences in gender, race, income, tastes, preferences, culture and historical background that typify the human race. There is only one assumed common denominator among people – they act purposefully to achieve their ends. (For purposes of simplicity, those ends are termed “happiness.”) Economic theory then proceeds to show how the price system tends to coordinate the plans and behavior of people despite the innumerable differences that otherwise characterize them.

In contrast, the aggregative or holistic theory begins with certain arbitrarily chosen aggregates – such as “blacks.” It assumes that skin color is the defining characteristic of members of this aggregate; that is, skin color determines both the actions of the people within the aggregate and the actions of non-members toward those in the aggregate. The theory derived from this approach is correct if, and only if, this assumption holds. The equivalent logic holds true of other aggregates like “women,” “labor,” et al., with respect to the defining characteristic of each. Since this basic assumption is transparently false to the facts, holistic theories – beginning with Saint-Simonian socialism, continuing with Marxism, syndicalism and the theories of Fourier, the Fabian socialists, Lenin, Sombart, Trotsky, and the various modern socialists and Keynesians – have had to make numerous ad hoc excuses for the “deviationism” practiced by some members of each aggregate and for the failure of each theory.

The Hans Lipschis Case

Is it proper in principle that Hans Lipschis, a former employee of Auschwitz and now ninety-three years old, be repatriated to Germany from the U.S. and tried as an accessory in the murder of 300,000 inmates of the notorious World War II death camp? Yes. The postwar tribunals, notably at Nuremberg, reaffirmed the principle that “following orders” of duly constituted authority is not a license to aid and abet murder.

Lipschis’s defense is that he was a cook, not a camp guard. But a relatively new legal theory, used to convict another elderly war-crimes defendant, John Demjanjuk, is that the only purpose of camps like Auschwitz was to inflict death upon inmates. Thus, the defendant’s presence at the camp as an employee is sufficient proof of guilt. Is this theory valid? No. A cook’s actions benefitted the inmates; a guard’s actions harmed them. If guards had refused to serve, the camps could not have functioned. But if cooks had refused to serve, the inmates would have died of starvation.

Verdicts such as that in the Demjanjuk case were undoubtedly born of the extreme frustration felt by prosecutors and men like Simon Wiesenthal and other Nazi hunters. It is almost beyond human endurance to have lived through World War II and then be forced to watch justice be cheated time after time after time. First the leading Nazis escaped or committed suicide. Then some of them were recruited to aid Western governments. Then some were sheltered by governments in South America and the Middle East. Attrition eventually overtook figures such as Josef Mengele. Occasionally, an Adolf Eichmann was brought to justice – but even he had to be kidnapped by Israeli secret agents before he could be prosecuted. Now the job of legally proving actual criminal acts committed by minor functionaries fifty, sixty or seventy years after the fact has become too difficult. So we cannot be surprised when desperate prosecutors substitute legal fancies for the ordinary rules of evidence.

Nevertheless, if the prosecution cannot prove that Lipschis committed actual crimes, then he must be acquitted. This has nothing to do with his age or the time lapse between the acts and the trial. Any other decision is a de facto application of the bogus principle of collective guilt.

Shinzo Abe and Guilt for Japanese Aggression in World War II

Japanese Prime Minister Abe is a classic politician. Like the Roman god Janus, he wears two faces, one when speaking abroad to foreign audiences and another when seeking reelection by domestic voters. His answers to questions about whether he was repudiating the stance taken by a previous Prime Minister in 1995 – that Japan was indeed guilty of aggression for which the Japanese government formally apologized – were delicately termed “equivocal” by the American magazine U.S. News & World Report. That is a euphemism meaning that Abe was lying by indirection, a tactic used by politicians the world over. He wanted his answer to be interpreted one way by Japanese voters without having to defend that interpretation to the foreign press.

Abe’s behavior was shameful. But that has absolutely nothing to do with the question of Japanese guilt for war crimes committed during and prior to World War II. That guilt was borne by specific individual Japanese and established by the Tokyo war-crimes tribunal. Indeed, one government spokesman eventually admitted this in just those words, albeit grudgingly, after Abe’s comments had attracted worldwide attention and criticism.

The implication is that Japanese today bear no “collective guilt” for the war crimes committed by previous generations of Japanese. (It would be wrong to use the phrase “by their ancestors,” since presumably few Japanese today are related by blood to the war criminals of seventy or eighty years ago.) The mere coincidence of common nationality does not constitute common ancestry except in the broad cultural sense, which is meaningless when discussing moral guilt. Are we really supposed to believe, for example, that the surviving relatives of Jesse James or Billy the Kid should carry around a weighty burden of guilt for the crimes of their forebears? In a world where the lesson of the Hatfields and the McCoys remains unlearned in certain precincts, this presumption seems too ridiculous for words.

Similarly, the fact that Japanese leaders in the 1920s, 30s and 40s were aggressively militaristic does not deny Japanese today the right to self-defense against a blatantly aggressive Chinese military establishment.

Much is made of Abe’s unwillingness to acknowledge the “comfort women” – women from Korea, China and other Asian nations who were held captive as prostitutes by Japanese troops. Expecting politicians to behave as historians is futile. If Japanese war criminals remain at large, apprehend and indict them. If new facts are unearthed about the comfort women or other elements of Japanese war crimes, publish them. But using these acts as a club against contemporary Japanese leaders is both wrong and counterproductive.

Besides, it’s not as if no other ammunition were available against Abe. He has followed Keynesian fiscal policies and monetary policies of quantitative easing since becoming prime minister. These may not be crimes against humanity, but they are crimes against human reason.

Macro vs. Micro

Academic economics today is divided into macroeconomics and microeconomics. The “national economy” is the supposed realm of macroeconomics, the study of economic aggregates. But as we have just shown, it is the logic of individual responsibility that actually bears on the issue of war crimes committed by the nations of Germany and Japan – because the crimes were committed by individuals, not by “nations.”

One of the most valuable lessons taught by classical economic theory is that the unit of analysis is the individual – in economics or moral philosophy.

DRI-179 for week of 5-3-15: Why Economics is Inseparable From Individual Responsibility

An Access Advertising EconBrief:

 Why Economics is Inseparable From Individual Responsibility

Many people know that the father of modern economics, Adam Smith, wrote An Inquiry Into the Nature and Causes of the Wealth of Nations in 1776. Few today realize that his most famous prior work was The Theory of Moral Sentiments in 1759. In Smith’s day, the conjunction of economics and moral philosophy was accepted, even taken for granted. Now economists are viewed as social scientists rather than philosophers, let alone moralists. Yet some of the most penetrating recent books and policy debates have revealed the economic underpinnings of genuine morality, rooted in the concept of individual responsibility.

Of all moral principles, individual responsibility may have taken the worst beating at the hands of the 20th century. The chief abuser was Sigmund Freud, founder of the modern school of psychology and the profession of psychiatry. The book Admirable Evasions: How Psychology Undermines Morality is primarily an exposé of the harm wrought by Freud and his descendants. The author, Theodore Dalrymple, is a psychiatrist who has viewed the profession from the inside as a former prison doctor and psychiatrist in private practice. (“Dalrymple” is the pen name of the Englishman Anthony Daniels, but to avoid confusion we follow the author’s convention in this article.) He wonders whether “Mankind…would…be the loser or the gainer… if all the anti-depressants and anxiolytics… were thrown into the sea… all textbooks of psychology were withdrawn and pulped” and “all psychologists ceased to practice.” He is in doubt despite the “modest contributions to the alleviation of suffering” by some areas of clinical psychological practice. This implies that the harm done by psychology must be both significant and ongoing.

The maxim “It takes one to know one” was never better illustrated than by Dalrymple. His only drawback is occupational tunnel vision; he gives short shrift to economic logic as the motive force behind the failure of psychology.

Freudian Fraud

Sigmund Freud, born in 1856 in Freiberg, Moravia, and raised in Vienna, Austria, underwent conventional medical education and training in neurology. Based on his interviews of patients, he founded the study of psychoanalysis. The fundamental principle of psychoanalysis is that the analyst possesses certain a priori truths about the patient’s mental makeup that establish a hierarchical relationship between the two. The analyst should enjoy a position of dominance, which the patient will inevitably resist. Only submission will enable the analyst to unlock the complexes and neuroses that plague the patient. These afflictions are the result of sexual pressures emerging in early childhood, including the male Oedipus complex and female penis envy. Patients are powerless to perceive and grapple with these primal forces; only psychoanalysis can bring them to the surface and resolve their conflicts.

Does it occur to you to wonder how the psychoanalyst himself became immune to these primal forces, hence worthy of the dominant analyst’s role? Well, the analyst himself supposedly had his own analyst, but the infinite regression involved in this issue was one of many logical problems never resolved in Freudian theory.

The term “psychology” derives from the ancient Greek word “psyche,” used to denote human consciousness. Freud divided the human psyche into three parts: the ego, or conscious mind that allows us to interact with reality; the id, or unconscious; and the superego, the way station between id and ego and repository of the societal and parental norms that control our behavior.

The first half of the 20th century elevated Freud to the status of cultural hero and icon. In the second half, rigorous study of his career, methods and techniques left Freudian theory in tatters. Freud based his theories on a combination of empirical generalization from his case histories and speculative conjecture. Many a successful scientific theory has been built on less, but in Freud’s case the result was a mess. Freud’s case histories were published using pseudonyms, a commendable attempt to protect the personal privacy of his subjects. This delayed their investigation and study. Eventually, it became clear that they had little or no scientific validity because their results were not measurable, they could not be replicated and they did not seem to be robust. Freud’s famous concepts – id, ego, superego, Oedipus complex and penis envy – have all been dropped from the lexicon of modern psychiatry.

Indeed, psychiatric practice today owes almost nothing to Freudian method. It is divided between the biological practitioners and the behaviorists. The biologicals treat “mental illness” completely differently than Freud did. Instead of viewing it as a unique phenomenon of the psyche, they see it as simply another branch of modern medicine. Conditions like schizophrenia and manic depression (now called bipolar disorder) are recognized as physical illnesses caused by chemical imbalances within the brain; they are treated with prescription medicines. This reinforces the logic of training psychiatrists as medical doctors rather than wizards of the psyche. Behaviorists talk with patients about their problems and help them cope with those problems – in this they bear a superficial resemblance to psychoanalysts. But there is no hierarchical relationship and no a priori theory about the origin of those problems. Moreover, behaviorists must be on the lookout for psychological problems with a biological source.

Where does psychoanalysis fit into this modern paradigm? It doesn’t. Maybe we should call psychoanalysts an endangered species – but there isn’t much impetus to preserve the species. Psychology is a profit-motivated profession. If psychoanalysis were capable of curing patients by resolving their problems rather than merely relieving them of an overburdened wallet, it would be thriving today. Instead, psychoanalysis is facing extinction.

If the commission of pseudoscience were Freud’s only sin, he would have slipped quietly into obscurity by now. Alas, this is the least of Freud’s mistakes. Sigmund Freud’s legacy lives on in ways that Freud himself hardly intended and would not have approved.

The Unintended Consequences of Freudian Psychology

Among Freud’s contentions was the claim that sexually restrictive social mores created neuroses and inhibitions that repressed natural human behavior. In his day, this made Freud a name as a libertine. The label was false, for Freud was sexually quite strait-laced and conventional. As Theodore Dalrymple acutely observes, the “profoundly subversive” element of Freudian theory was “that desire, if not fulfilled, will lead to pathology… [This] makes self-indulgence man’s highest goal. It is a kind of treason to the self, and possibly to others, to deny oneself anything” [emphasis added]. Dalrymple supplies a chilling example of this philosophy in action, quoting one of his patients, a murderer: “I had to kill her, doctor, or I don’t know what I would have done.”

The idea that customs, traditions and morality evolve because they have value – survival value and competitive value in fulfilling human desires – may not have occurred to Freud. It definitely did not occur to his many successors, who were determined to engineer human evolution according to a central plan. The effects have been the reverse of those intended. Throughout the 20th century, Freudian psychology has walked side by side with Marxian philosophy and economics. Yet by encouraging people to shrug off the so-called “repression” of self that motivates respect for the rights and sensitivities of others, Freudianism has been the enabler of the self-absorption so often decried by critics of capitalist materialism.


The heir to Freudian psychology is the behaviorism of B.F. Skinner and his disciples. Here, Dalrymple deplores the behaviorist tendency to categorize every complaint as a “disorder,” subject to psychiatric eradication by behavior modification. “No statement that a psychiatric disturbance has such-and-such a prevalence in such-and-such a population should be taken at face value, especially when it is a plea, as it so often is, explicit or implicit as the case may be, for more resources to treat it, the supposed prevalence having risen shockingly in the last few years.”

Dalrymple is not merely questioning the statistical validity of this technique – although that alone would justify the warning, since the bogus use of statistics has been the biggest scandal of the last two decades in both the social sciences and the natural sciences. He is also extending to social science the Heisenberg-like principle that, by investigating a phenomenon, the scientist alters its course. “It is not merely that epidemiological researchers in this field can find what they are looking for; it is that they can provoke what they are looking for.” This principle cannot be stressed too strongly.

The social-welfare establishment has identified dozens of conditions requiring treatment. This treatment requires money and the existence of a bureaucratic establishment to provide, fund and supervise it. That establishment provides a living for many people. The “victims” of the conditions get real income in various forms: money, medical treatment and certified “victim” status as addicts or whatever the jargon term is for their condition.

And the victims also get a certified excuse for their misbehavior.

This is a form of real income whose importance cannot be overstated. Whereas in pre-psychology days, the victims were ostracized or otherwise discouraged from engaging in the behavior, now they are encouraged in it by the various subsidies provided. While proponents of the “therapeutic state” may indignantly object that nobody wants to be sick, objective research strongly confirms the role of incentives in enabling bad behavior.

This whole system has become self-promoting and self-aggrandizing. “The expansion of psychiatric diagnoses leads paradoxically and simultaneously to overtreatment and undertreatment. The genuinely disturbed get short shrift; those with chronic schizophrenia, which seems most likely to be a genuine pathological malfunction of the brain [i.e., not “mental illness” at all but physical illness of the brain], are left to molder in doorways, streets and stations of large cities, while untold millions have their fluctuating preoccupations attended to with the kind of attention that an overconcerned mother gives her spoiled child with more or less the same results.”

The genuinely ill get less treatment because, being less able to earn income, they get less attention. The pseudo-ill are more able to command attention and show better “results” with less effort; therefore, they are easier and more satisfactory to “treat.”

Psychology is able to create the demand for its services by creating pseudo-illness. It does so, argues Mona Charen in her National Review review of Dalrymple’s book, by “creating one excuse after another for bad behavior – our terrible childhoods, our genes, our neurotransmitters, our addictions. In each case, and often with extremely unscientific reasoning, we are offered absolution. None of us is really responsible for our behavior. The whole psychological enterprise, Dalrymple argues, has had the effect of excusing poor choices and bad character. ‘Virtue is not manifested in one’s behavior, always so difficult and tedious to control, but in one’s attitude to victims’” [emphasis added].

This book may have opened our eyes to the 20th century. But it was written by a psychiatrist. How does economics come into it?

The Economics of Individual (Ir)responsibility

In both classical and neoclassical economics, the unit of analysis is the individual human being. (For immediate purposes, the separation between “classical” and “neoclassical” will be taken as the “Marginal Revolution” in the theory of consumer demand beginning roughly in the 1870s. This distinction is not important to what follows.) When the focus shifts to the theory of the firm, the unifying element is the assumption of profit maximization that directs the diverse strivings of the firm’s members toward a single goal.

Free markets are governed by the principle of mutually beneficial voluntary exchange. Mutual benefit provides the motivation to exchange voluntarily. There is a tacit presumption that each individual is responsible for his or her actions; that is, neither is liable for the actions of the other. This is entirely logical, since each one is the reigning expert on his or her wants, desires, shortcomings, plans and expectations. Neither can possibly know as much about the other as he or she knows about himself or herself. Thus, the concept of individual responsibility is an automatic byproduct of the philosophy of free markets.
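The mutual-benefit logic of voluntary exchange can be made concrete with a minimal illustration (the numbers here are hypothetical, chosen only to show the arithmetic):

```latex
% Hypothetical illustration of mutually beneficial exchange.
% Suppose a buyer values a good at V_b = 10 and the seller's
% reservation value is V_s = 6. Any price p with 6 < p < 10
% leaves BOTH parties better off:
\[
\underbrace{V_b - p}_{\text{buyer's gain}} > 0
\qquad \text{and} \qquad
\underbrace{p - V_s}_{\text{seller's gain}} > 0 .
\]
% At p = 8, each side captures a surplus of 2. Neither party
% needs a third-party referee to certify that the trade was
% "fair" -- the fact that both agreed to it is the evidence.
```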

No wonder, then, that Adam Smith trafficked in moral philosophy. The surprising thing is that somewhere along the way this got lost as economists transformed themselves into men in white coats peddling business forecasts of GDP growth rates and interest rates.

Contrast the relationship between human beings engaging in free trade and that between analyst and patient in today’s “therapeutic state.” The patient has a problem. No surprise there, since all of us do virtually all the time. The patient has an incentive to view this problem as beyond his control – if not a physical illness, then a neurosis, a complex, an addiction, a “sickness” of a metaphoric kind. The incentive is multi-pronged.

First, his lack of control relieves him of responsibility. He has no moral responsibility for having created, nurtured or tolerated it. Since he has no responsibility for it, he need feel no guilt over it.

Second, he now has a moral claim on the resources of others that did not previously exist. This claim is a form of real income that may become tangible if he can extract voluntary charity from them or involuntary payment in the form of government subsidies.

Third, his status as a moral claimant who suffers from a problem not of his own making makes him a victim. Victim status makes him a member of a recognized interest group. In addition to the possibility of extracting tangible real income via charity or government subsidies, he can also receive the psychic benefit that goes with public recognition as a member of a victim class.

Now shift attention to the analyst, whose incentives run parallel with those of the patient. He has an incentive to identify the patient’s problem as either a physical sickness or a psychic “mental illness.” Either way, this identification immediately relieves him of any guilt that might otherwise attach to treating the patient. Now he is merely a doctor treating a sick patient. He need feel no guilt over that.

And once his doctor status is secure, the analyst has no qualms about filing an intellectual lien on the assets of the public, either by appealing to their charitable sympathies or to their legal responsibilities as citizens and taxpayers.

Victims require saving. Saving requires saviors. Saviors are heroic figures. Thus, analysts earn psychic benefits from assuming heroic public status, just as patients gain psychic benefits from assuming victim status.

When two groups of people have so much to gain from pursuing a congruent sequence of activities, what does economic logic say will happen? The “equimarginal principle” – the fundamental principle of economic optimization underlying the theories of consumer demand, the firm and input supply – says that as long as the marginal benefit of an activity exceeds its marginal cost, economic actors will increase their pursuit of the activity. Indeed, if two non-competing groups find that their ends coincide, the groups may even collude, either openly or tacitly, to further those ends.
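The equimarginal principle invoked above can be stated compactly (a standard textbook sketch, not notation from Dalrymple or Charen):

```latex
% The equimarginal principle: expand an activity x so long as its
% marginal benefit exceeds its marginal cost; stop where they meet.
\[
MB(x) > MC(x) \;\Rightarrow\; \text{increase } x,
\qquad
MB(x^{*}) = MC(x^{*}) \;\text{at the optimum } x^{*}.
\]
% Applied to a consumer allocating income across goods i with
% prices p_i, the same logic equalizes marginal utility per dollar:
\[
\frac{MU_1}{p_1} = \frac{MU_2}{p_2} = \cdots = \frac{MU_n}{p_n}.
\]
```

The point in the text follows directly: so long as each extra diagnosis yields the analyst and the patient more in benefits (income, status, absolution) than it costs them, both groups keep expanding the activity.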

And that is just what has happened in mental health during the 20th century. Psychologists and patients have tacitly colluded to enlarge the “mental-health” establishment. That is what Theodore Dalrymple has had the temerity to point out in his politically incorrect book. Its political incorrectness is its outstanding virtue; its sole vice is its economic incorrectness. Where Dalrymple has made a literary-career specialty of telling unpopular and unpleasant truths about havoc wreaked by the pseudo-science of modern psychology, he has been unaccountably reticent in failing to disclose the economic logic underlying his position.

Why is it Important to Acknowledge the Role of Individual Responsibility in Economics?

In the most important excerpt quoted above, Dalrymple acknowledges that “the genuinely disturbed get short shrift.” These are people who suffer from psychoses formerly diagnosed as “mental illness” and treated with (utterly useless) psychotherapy. Thanks to the onetime heretics who refused to knuckle under to Freudian dogma, we now know that schizophrenia and manic depression (currently called bipolar disorder) are neurochemical disorders of the brain. As is true with the most intractable physical disorders, we can offer only limited medical therapy for these conditions. But even this help is often denied to those who need it most.

Dalrymple rightly sees the outlines of the problem because he has spent a lifetime within the system as prison doctor and psychiatrist in private practice. As a resident of the U.K., he lived under Great Britain’s infamous National Health Service (NHS). He knows the workings of government the way a gulag prisoner knows the workings of the camp. But it would be expecting too much to hope that a man who spent his life acquiring expertise in medicine and psychiatry and emerged alive from the toils of the NHS should also be conversant with economic theory.

The reason for the denial of therapy to the “genuinely disturbed” is straightforward. The victims are unable to act as their own advocates. The treatment of so-called mental illness is plagued by a version of Gresham’s Law (“bad money drives out good money”), in which bad therapy drives out good therapy. The pseudo-victims are the squeaky wheels, greased by their own financial and political resources and the very fact that their lack of true illness yields better “results” from treatment. Because the treatment of mental illness is a jealously guarded prerogative of government and government budget-allocation is a jealously guarded prerogative of politicians, funds allocated to the treatment of the truly psychotic are a small slice of an already-small pie.

Individual responsibility is vital to the operation of civil society. It goes hand-in-hand with human freedom and free markets. But it breaks down in the rare – but real – cases where individuals are incapable of acting in their own behalf.

As things stand, government is the agency designated to act for those who cannot act for themselves. For example, children cannot enter into contracts for employment without the consent of their parents or guardian. To ensure that this position is not abused, children’s earnings are subject to protection by trusts. Child-welfare agencies also exist (ostensibly) to prevent other types of abuse.

But when it comes to mental health, government is a walking, talking, breathing conflict of interest. It is in the same conflicted position as the analyst because government is not a neutral party. It does not act for “the common good” because there is no “common good” – there are only diverse goods. This diversity can be reconciled only by a mechanism that places a relative value on each good, so that the tradeoffs required by the reconciliation can be made efficiently and consistently. When government becomes the arbiter in a situation where its decision can produce more government, it always decides in favor of government intervention. (The only exception is when it is called upon to perform a true function of government, which would require a sacrifice of some other, non-essential government activity – in which case it always chooses the non-essential over the essential.) Relying on government, with its built-in conflict of interest, is what got us into the fix we’re in.

When people cannot act in their own behalf, somebody must act for them. Their spouse or closest relatives are the first place to turn. When they cannot or will not act and government is disqualified, the only alternative is private charity.

Why has the word “charity” acquired a pejorative tinge? After all, research shows that Americans are very much inclined to support charitable causes. The problem is that too many Americans are still bewitched by the wish-fulfillment fantasy of government as problem-solver of first resort. Were government confined to its true functions, we would have the additional real income and discretion with which to solve the problems that government is now purporting – but failing – to solve.

As Dalrymple notes, the paradigm for any problem relating to health is to identify a “new” disorder, spread the alarm about its “epidemic” status and demand (what else?) government action at once, if not sooner. The good news about Dalrymple’s book is that the “problem” is vastly smaller than advertised. The bad news is that a real problem exists that is not being addressed and is immune to government action. In fact, the best thing would be to keep government away altogether. The worst news of all is that the attempt to solve the non-existent problem has created a worse one – the erosion of the irreplaceable concept of individual responsibility.

The key to sorting all this out is the economic logic underlying it all.