DRI-248 for week of 10-19-14: The Economic Inoculation Against Terrorism

An Access Advertising EconBrief:

The Economic Inoculation Against Terrorism

Last week’s EconBrief analyzed the military adventures undertaken by Great Britain and the United States over the last two centuries and found uncanny and unsettling similarities. In particular, we detected a growing tendency to intervene militarily to settle disputes coupled with a growing distaste for war per se. Given the lack of close substitutes for complete victory in military conflict, this is a disastrous combination.

Both Great Britain and the United States found increasing need to use military force but were increasingly reluctant to apply maximum force with promptness and dispatch. The British dithered when confronted by Islamic fanaticism in the Sudan and ended up losing a national hero and vast prestige, and being forced to intervene in the end anyway. The British then faced one revolt after another in southern Africa, Ireland, India and Palestine. In each case, they reacted in measured ways only to be excoriated when finally forced to take stronger action. Ultimately, they abandoned their empire rather than take the actions necessary to preserve it.

Compare the actions taken by Great Britain in India against the passive resistance led by Gandhi with those that would have been taken by, say, a totalitarian nation like Nazi Germany, Soviet Russia or Communist China. The British were repeatedly forced to back down from using force against Gandhi – not by superior force or numbers wielded by the Mahatma but by their own moral qualms about exerting the force necessary to prevail. By contrast, Gandhi would never have gained any public notice, let alone worldwide acclaim, had he lived and operated under the Third Reich. Hitler’s minions would have murdered him long before he rose to public prominence. In Soviet Russia, Gandhi would have earned a bullet in the back of his head and unmarked burial in a mass grave. In Red China, Gandhi would either have undergone re-education or joined the faceless millions on the funeral pyre in tribute to revolutionary communism.

Lawrence in Arabia: Visionary or Myopic Mystic?

Director David Lean’s magnificent film Lawrence of Arabia acquainted the world with the story of British Col. T.E. Lawrence, an obscure officer who seized the opportunity to unite disparate and warring Arab tribes in guerilla warfare against the Turks of the Ottoman Empire during World War I. Playwright Robert Bolt’s screenplay depicts the Arabs as simple, childlike victims of wily colonial exploiters. Lawrence is portrayed as a martyr who seeks to restore the Arabs to their former historical glory by casting out the foreign devils from Arabia – “Arabia for the Arabs.”

Lawrence is continually frustrated in his campaign to organize the Arabs into an effective and cohesive fighting force. Tribal and religious divisions separate Arabs from each other almost as much as from the Turks. Why can’t they view themselves as Arabs, he wonders, rather than as members of particular tribes or sects?

When Lawrence succeeds in whipping his guerilla force into fighting shape, he turns them into a virtual column of the British army and becomes instrumental in winning the war in the Middle East. He assumes that, once united in war, the Arabs will remain so in peacetime. They will stand fast against the British and French colonialists and reclaim their heritage. When this hope proves illusory, he retreats home to England in disillusion.

The perspective of economics allows us the insight that Lawrence was doomed to disappointment from the start. In wartime, people of all races, creeds and nationalities are able and willing to put aside personal priorities in favor of the mutual overriding priority of winning the war. At war’s end, however, there is no longer any single overriding priority strong enough to claim universal allegiance. Now each pursues his or her own interest. Of course, this pursuit of individual interest can still produce broadly beneficial results. Indeed, it should do just that – provided the disciplining forces of free markets and competition are given free play. But in post-World War I Arabia, the ideas of Adam Smith and free markets were as alien as Dixieland jazz. Economically, Arabia was primitive and aboriginal. Its tribes were dedicated to warfare and plunder – just as the aboriginal peoples of Australia, New Guinea, North America, South America and Africa were before modern civilization caught up with them. There was a tradition of trade or exchange in aboriginal culture – but no tradition of freedom, free markets and property rights.

The Flame that Ignited the Arab Spring

Of course, Arab society did not stall out completely at the aboriginal stage of primitive, nomadic desert life. Arabs were naturally blessed with copious quantities of petroleum, the vital economic resource of the 20th century. Though mostly unable to develop this resource themselves, they did play host to companies from Western industrialized nations that created infrastructure for that purpose. The resulting cultural interaction paved the way for modernization and a measure of secularization. Thus, from a distance the major cities of the Middle East might be hard to distinguish from those of the West. Up close, though, the differences are stark.

The noted South American economist and political advisor Hernando De Soto led a joint research study into the origins of the Arab Spring of 2011. He recounted his experiences in the recent Wall Street Journal op-ed, “The Capitalist Cure for Terrorism” (Saturday-Sunday, October 11-12, 2014). The seminal event of this movement was the self-immolation of a 26-year-old Tunisian man named Mohamed Bouazizi. Judging from Western coverage of the Middle East, one would expect him to have been unemployed, disaffected and despairing of his plight. As De Soto and his team discovered, the truth was far different.

Bouazizi was not unemployed. He was a street merchant, one of the most common occupational categories in the Arab world. He began trading at age 12, graduating to the responsible position of bookkeeper at the local market by the time he was 19. At the time of his death, he was “selling fruits and vegetables from different carts and sites;” i.e., he was a multi-product, multiple-location entrepreneur. It seems clear that he was not driven to extremity by idleness and despair. So what drove him to public suicide?

Like most members of his trade, Bouazizi operated illegally. His dream was to obtain the capital to expand his business into the legal economy. He wanted to buy a pickup truck for delivering his vegetables to retail outlets. He longed to form a legal company as an umbrella under which to operate – to stake clear title to assets, establish collateral and get a loan for the truck.

This dream seems modest to American ears. But for Bouazizi it was unattainable. “Government inspectors made Bouazizi’s life miserable, shaking him down for bribes when he couldn’t produce [business] licenses that were (by design) virtually unobtainable. He tired of the abuse. The day he killed himself, inspectors had come to seize his merchandise and his electronic scale for weighing goods. A tussle began. One municipal inspector, a woman, slapped Bouazizi across the face. That humiliation, along with the confiscation of just $225 worth of his wares, is said to have led the young man to take his own life.”

“Tunisia’s system of cronyism, which demanded payoffs for official protection at every turn, had withdrawn its support from Bouazizi and ruined him. He could no longer generate profits or repay the loans he had taken to buy the confiscated merchandise. He was bankrupt, and the truck that he dreamed of purchasing was now also out of reach. He couldn’t sell and relocate because he had no legal title to his business to pass on. So he died in flames – wearing Western-style sneakers, jeans, a T-shirt and a zippered jacket, demanding the right to work in a legal market economy.”

Asked if Bouazizi had left a legacy, his brother replied, “Of course. He believed the poor had a right to buy and sell.”

Mohamed Bouazizi was not alone. In the next two months, at least 63 people in Tunisia, Algeria, Morocco, Yemen, Saudi Arabia and Egypt set themselves afire in imitation of and sympathy with Bouazizi. Some of them survived to tell stories similar to his. Their battle cry was “we are all Mohamed Bouazizi.” It became the rallying cry of the Arab Spring, bringing down no fewer than four political regimes.

The Western news media have been heretofore silent about the true origins of the Arab Spring. It did not originate in “pleas for political or religious rights or for higher wage subsidies.” None of the “dying statements [of the 63] referred to religion or politics.” Instead, the survivors spoke of “economic exclusion,” a la Bouazizi. “Their great objective was ‘ras el mel‘ (Arabic for ‘capital’), and their despair and indignation sprang from the arbitrary expropriation of what little capital they had.”

Das Kapital or Capital?

Nobody speaks with greater force on this subject than Hernando De Soto. He is the Latin American Adam Smith, the South American champion of free markets and property rights. He is now the world’s leading property-rights theorist, having ascended upon the deaths of Ronald Coase and Armen Alchian. And he put his own ideas into practice in his home country of Peru by leading the world’s only successful counter-terrorist movement in the 1980s.

The Shining Path was a Marxist band of terrorist revolutionaries who tried to overthrow the Peruvian government in the 1980s. They were led by a onetime university professor named Abimael Guzman. Guzman posed as the champion of Peru’s poor farmers and farm workers. He organized Peru’s Communist Party around the idea of massive farming communes and used the Shining Path as the recruiting arm for these communes. Some 30,000 resisters were murdered. Officials were kidnapped and held for ransom. This strategy gave Shining Path control of the Peruvian countryside by 1990.

De Soto was the government advisor charged with combatting Shining Path. He didn’t forswear the use of military force, but his first move was toward the library and the computer rather than the armory. “What changed the debate, and ultimately the government’s response, was proof that the poor in Peru weren’t unemployed or underemployed laborers or farmers, as the conventional wisdom held at the time. Instead, most of them were small entrepreneurs, operating off the books in Peru’s ‘informal economy.’ They accounted for 62% of Peru’s population and generated 34% of its gross domestic product – and they had accumulated some $70 billion worth of real-estate assets” [emphasis added].

This new learning completely confuted the stylized portrayal of poverty depicted by Guzman and his Shining Path ideologues. It enabled De Soto and his colleagues to do something that is apparently beyond the capabilities of Western governments – eliminate three-quarters of the regulations and red tape blocking the path of entrepreneurs and workers, allow ordinary citizens to file complaints and legal actions against government and provide formal recognition of the property rights of those citizens. An estimated 380,000 businesses and 500,000 jobs came out of the shadows of the informal economy and into the sunlight of the legal, taxed economy. One result of this was an extra $8 billion of government revenue, which rewarded government for its recognition of the private sector.

Having put the property rights of the poor on a firm footing, De Soto could now set about eradicating Shining Path, confident that once the government won the guerilla war it would not lose the peace that followed. In true free-market fashion, Peru reworked its army into an all-volunteer force that was four times its previous size. This expanded force rapidly defeated the guerillas.

In this connection, it is instructive to compare the effect of military intervention in Peru with that undertaken elsewhere. The military interventions undertaken by the U.S. and earlier by Great Britain served to recruit volunteers for terrorist groups by creating the specter of a foreign invader imposing an alien ideology on the poor. In Peru, volunteers flocked to an anti-terrorist cause that was empowering them rather than threatening them, enriching them and their neighbors rather than bombing them.

Peru stands out because the economic medicine was actually given. Other links between poverty, terrorism and lack of property rights can be cited. In the 1950s and 60s, Indonesia was home to Communist and terrorist movements. It was also a land that consistently thwarted its entrepreneurs, many of whom were immigrant Chinese, in ways reminiscent of an Arab state. The southern half of Africa has long been known for stifling entrepreneurship through bureaucratic controls and monopoly, often combined with nepotism and corruption. This began as a colonial inheritance and has passed down to the line of despots that has ruled much of the continent since the advent of independence.

All We Are Saying Is Give Economics a Chance

The American public is repeatedly sold the proposition that the world is dangerous and becoming more so with each passing day. Alas, the kind of military interventions practiced by the U.S. have not lessened the danger in the past and have, in fact, increased it. The only tried-and-true, time-tested solution to the problems posed by terrorism is economic, not military. We refer retrospectively to World War II as “the good war” because our cause seems so unimpeachably just when juxtaposed alongside the evils of Fascism and the Holocaust. But it is not moral afflatus and good intentions that justify war. It is the postwar economic miracles worked in Germany and Japan that set an invisible seal on our rosy memories of World War II. By contrast, for example, the defeat of Germany in World War I now seems Pyrrhic because the war and subsequent draconian peace terms produced Germany’s interwar economic upheaval and resulting lurch into Fascism.

The evil of war lies in the rarity of its success, not the oft-cited barbarity of its practice. The U.S. went to war in Korea, Vietnam, Kuwait, Iraq and Afghanistan to counter real evils. We enjoyed considerable military success and achieved some of our goals. But we did not achieve victory. Last week’s EconBrief reminds us how overwhelmingly difficult it was even for Great Britain and the U.S. – each far and away the foremost military power of its day – to achieve their ends through war. Only in South Korea was long-term success attained, and there it was due to economic victory rather than military victory.

Careful study of world poverty and terrorism will uncover an economic phenomenon, against which military measures are largely unavailing and police tactics are merely a stopgap.

“They” Can’t Adapt to Free Markets and Institutions

One entrenched obstacle to adopting Hernando De Soto’s game plan against terrorism is the conventional thinking that certain cultures are inherently unable to absorb the principles of economics and free markets. This argument is so vaguely made that it is never clear whether proponents are arguing the genetic or cultural inferiority of the affected peoples. Recently it has been applied to post-Soviet Russia, after attempts to acclimate the Russian people to free markets failed. The interesting thing about this episode is that it began with the proposition that Western economic consultants could design market institutions and then superimpose them on the Russian people. In other words, elite analysts began by assuming that Russians could easily adapt to whatever economic system was designed by others for their benefit, but then took the polar opposite position that Russians were incapable of adapting to free markets. No provision was made for the possibility that – having lived for centuries under rigid autocracy – Russians might need time to adapt to free institutions.

For centuries, Chinese were considered inferior and suitable only for low-skilled labor. That is the task to which most immigrant Chinese were consigned in 19th-century America. While Chinese in China failed to achieve economic development throughout most of the 20th century, immigrant Chinese were the world’s great ethnic economic development success story. Eventually Taiwan and mainland China joined the ranks of the developed world and another development myth bit the dust.

When the short-term results of the Arab Spring dislocations disappointed many in the West, Arabs became the latest people accorded the dishonor of being deemed unable to accommodate freedom and free markets. Perhaps the most concise response to this line of thought was given indirectly by Arab leaders responding to De Soto’s charge that their countries lacked the legal infrastructure to bring the poor into the formal economy. “You don’t need to tell us this,” one replied. “We’ve always been for entrepreneurs. Your prophet chased the merchants from the temple. Our prophet was a merchant!” In other words, the Arab tradition accommodates trade, even if their legal system is hostile to it.

Once again, this space stresses the distinction between the Rule of Law – which abhors privilege and worships freedom – and mere adherence to statutory law – which often cements tyranny into place.

Bringing Free Markets and Property Rights to the Middle East

As far as Western elites and the Western news media are concerned, the only kind of Middle East economic reform worth mentioning is foreign aid. But over a half-century of government-to-government foreign aid has proven to be an unqualified disaster. Economists like William Easterly and the late Lord Peter Bauer have written copiously on the pretensions of Western development economists and the corruption of Western development agencies. This is the deadest of dead ends.

De Soto’s approach is the only institutional approach worth considering. Apparently, it is actually receiving consideration by the beneficiaries of the Arab Spring. Egypt’s President, Abdel Fattah Al Sisi, commissioned De Soto and his team to study Egypt’s informal economy. That study found that Egypt’s poor get as much income from capital, in the informal economy, as they do from salaries in the formal economy. More precisely, some 24 million salaried citizens earn about $21 billion per year in salaries while owning some $360 billion in unrecognized assets that throw off roughly an equivalent amount of yearly income. As De Soto recognizes, this income is approximately 100 times the total of all Western financial, military and development aid to Egypt. It is also “eight times more than the value of all foreign direct investment in Egypt since Napoleon invaded more than 200 years ago.”
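
A quick back-of-the-envelope check, treating De Soto’s round figures as given (no outside data), shows the magnitudes they imply:

$$\text{implied yield on informal assets} \approx \frac{\$21\ \text{billion}}{\$360\ \text{billion}} \approx 5.8\%\ \text{per year}$$

$$\text{informal capital income per salaried citizen} \approx \frac{\$21\ \text{billion}}{24\ \text{million}} \approx \$875\ \text{per year}$$

In other words, the quoted numbers imply that these unrecognized assets yield a return of roughly six percent per year, earned entirely outside the legal system.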

The problem is that much of this value is locked up in bureaucratic limbo. “It can take years to do something as simple as validating a title in real estate.”

This is the real secret to achieving economic development in the Middle East. It is also the secret to fighting terrorism and preserving American security.

DRI-306 for week of 6-2-13: What Is (Or Was) ‘American Exceptionalism’?

An Access Advertising EconBrief:

What Is (Or Was) ‘American Exceptionalism’?

Ever since the 1970s, but increasingly since the financial crisis of 2008 and ensuing Great Recession, eulogies have been read for American cultural and economic preeminence. If accurate, this valedictory chorus would mark one of the shortest reigns of any world power, albeit also the fastest rise to supremacy. Even while pronouncing last rites on American dominance, however, commentators unanimously acknowledge our uniqueness. They dub this quality “American exceptionalism.”

This makes sense, since you can’t very well declare America’s superpower status figuratively dead without having some idea of what gave it life in the first place. And by using the principles of political economy, we can identify the animating features of national greatness. This allows us to perform our own check of national vital signs, to find out if American exceptionalism is really dead or only in the emergency room.

Several key features of the American experience stand out.

Free Immigration

Immigration (in-migration) fueled the extraordinary growth in U.S. population throughout its history. Immigration was mostly uncontrolled until the 1920s. (The exception was Chinese immigration, which was subject to controls in the late 19th century.) Federal legislation in the 1920s introduced the concept of immigration quotas determined by nation of origin. These were eventually loosened in the 1960s.

From the beginning of European settlement in the English colonies, inhabitants came not only from the mother country but also from Scotland, Ireland, Wales, the Netherlands, Spain, France, Germany and Africa. Scandinavia soon contributed to the influx. Some of the earliest settlers were indentured servants; slaves were introduced in the middle of the 17th century.

Today it is widely assumed that immigrants withdraw value from the U.S. rather than enhancing it, but this could hardly have been true during colonial times when there was little or no developed economy to exploit. Immigrants originally provided the only source of labor and have continued to augment the native labor supply down to the present day. For most of American history, workers were drawn to this country by wages that were probably the highest in the world. This was due not just to labor’s relative scarcity but also to its productivity. Immigrants not only increased the supply of labor (in and of itself, tending to push wages down) but also complemented native labor and made it more productive (tending to push wages up). The steady improvements in technology during the Industrial Revolution drove up productivity and the demand for labor faster than the supply of labor increased, thereby increasing real wages and continually drawing new immigrants.
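
These opposing forces can be captured in a toy supply-and-demand model. The sketch below is a minimal illustration, not anything from the EconBrief itself: the linear curves and every number in it are invented assumptions. It shows the wage falling when immigration shifts labor supply outward in isolation, and rising when labor demand (productivity and complementarity) shifts outward faster – the pattern the paragraph attributes to the Industrial Revolution era.

```python
# Toy linear labor market; all curves and parameter values are hypothetical.

def equilibrium_wage(d_intercept, d_slope, s_intercept, s_slope):
    """Solve demand = supply for the wage w, where
    labor demanded = d_intercept - d_slope * w and
    labor supplied = s_intercept + s_slope * w."""
    return (d_intercept - s_intercept) / (d_slope + s_slope)

w_baseline = equilibrium_wage(100, 2, 10, 1)           # no immigration: w = 30.0
w_supply_only = equilibrium_wage(100, 2, 30, 1)        # supply shifts out alone: w ~ 23.3
w_supply_and_demand = equilibrium_wage(130, 2, 30, 1)  # demand shifts out faster: w ~ 33.3

print(w_baseline, w_supply_only, w_supply_and_demand)
```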

Economists have traditionally underrated the importance of entrepreneurship in economic development, but historians have noted the unusual role played by Scottish entrepreneurs like Andrew Carnegie in U.S. economic history. At the turn of the 20th century, the business that became the motion-picture industry was founded almost entirely by immigrants. Most of them were Jews from Eastern Europe who stepped on dry land in the U.S. with no income or assets. They built the movie business into the country’s leading export industry by the end of the century. In recent years, Asians and Hispanics have taken up the entrepreneurial slack left by the native population.

An inexplicably ignored chapter in U.S. economic history is the culinary (and gastronomic) tradition linked to immigration. Early American menus were heavily weighted with traditional English dishes like roast beef, breads and puddings. Soon, however, immigrants brought their native cuisines with them. At first, each ethnic enclave fed its own appetites. Then immigrants opened restaurants serving their own communities. Gradually, these establishments attracted native customers. Over decades, immigrant dishes and menus became assimilated into the native U.S. diet.

Germans were perhaps the first immigrants to make a powerful impression on American cuisine. Many Germans fought on the American side in the Revolution. After independence was won, a large percentage of opposing Hessian mercenaries stayed on to make America their home. Large German populations inhabited Pennsylvania, Illinois and Missouri. The so-called Pennsylvania Dutch, whose cooking won lasting fame, were German (“Deutsch”).

In the 19th century, hundreds of thousands of Chinese laborers came to the U.S., many to work on western railroad construction. They formed Chinese enclaves, the largest one located in San Francisco. Restaurants serving regional Chinese cuisines sprang up to serve these immigrants. When Americans displayed a taste for Chinese food, restaurateurs discovered that they had to tailor the cooking to American tastes, and these “Chinese restaurants” served Americanized Chinese food in the dining room while preparing authentic Chinese food in the kitchen for immigrant customers. Today, this evolutionary cycle is complete; American Chinese restaurants proudly advertise authentic dishes specialized along Mandarin, Szechuan and Cantonese lines.

Meanwhile, back in the 1800s, Italians were immigrating to America. Italian food was also geographically specialized and subsequently modified for American tastes. Today, Italian food is as American as apple pie and as geographically authentic as its Chinese counterpart. The Irish brought with them a simple but satisfying mix of recipes for starches and stews. Although long restricted to cosmopolitan coastal centers, French cooking eventually made its way into the American diet.

Mexicans began crossing the Rio Grande into the U.S. during the Great Depression. Their numbers increased in the 1950s, and this coincided with the advent of Mexican food as the next great ethnic specialty. Beginning in the late 1960s and coinciding with the rise of franchising as the dominant form of food retailing, Mexican food took the U.S. palate by storm. It followed the familiar pattern, beginning with Americanized “Tex-Mex” and culminating with niche Mexican restaurants catering to authentic regional Mexican cuisines.

Today, restaurant dining in America is an exercise in gastronomic globe-trotting. Medium-size American cities offer restaurants flying the ethnic banners of a dozen, fifteen or twenty nations – not just Italian, Chinese and Mexican food, but the dishes of Spain, Ethiopia, Thailand, Vietnam, Ireland, India, Greece, Denmark, the Philippines, Germany and more.

Immigration was absolutely necessary to all this development. As any experienced cook can attest, simple copying of recipes could not have reproduced the true flavor of these dishes, nor could non-natives have accomplished the delicate task of modifying them for the American palate while keeping the original versions alive until they eventually found favor with the U.S. market.

It is ironic that so much debate focuses on the alleged need for immigrants to assimilate U.S. culture. This single example shows how America has assimilated immigrant culture to a far greater degree. Indeed, American culture didn’t exist prior to immigration and has been created by this very assimilation process. Now, apart from learning English, it is not clear how much is left for immigrants to assimilate. For example, consumer products like Coca-Cola and McDonald’s hamburgers have become familiar to immigrants through U.S. exports before they ever set foot here.

Cultural Heterogeneity

Many of the great powers of the past were trading civilizations, like the Phoenicians and the Egyptians. By trading in the goods and languages of many nations, they developed a cosmopolitan culture.

In contrast, physical trade formed a fairly modest fraction of economic activity in the U.S. until well into the 20th century. The U.S. achieved its cultural heterogeneity less through trade in goods and services than via trade in people. The knowledge and experience shared by immigrants with natives produced a similar result.

Economists have long known that these two forms of trade substitute for each other in useful ways. For example, efficient use of a production input – whether labor, raw material or machine – requires that its price in different locations be equal. Where prices are not equal, equalization can be accomplished directly by movements of the input from its low-priced location to its high-priced location, which tends to raise the input’s price in the former location and lower it in the latter location. Or, it can be accomplished indirectly by trade in goods produced using the input; since the good will tend to be cheaper in the input’s low-priced location, exports to the high-priced location will tend to raise the good’s price, the demand for the input and the input’s price in that location.

Input-price equalization is a famous case of trade in goods obviating the necessity for trade in (movement of) people. Cultural heterogeneity is a much less well-known case of the reverse phenomenon – immigration substituting for trade in goods.
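
Either route can be made concrete with a toy simulation. The following is a minimal sketch under assumed functional forms (a square-root production technology and invented labor stocks, nothing taken from the text): it traces the migration route, in which movement of the input itself equalizes the two regional wages.

```python
# Toy illustration of input-price equalization via migration of the input.
# Technology and numbers are hypothetical.

def wage(labor, productivity=10.0):
    """Marginal product of labor with diminishing returns:
    output = 2 * productivity * labor**0.5, so MPL = productivity / labor**0.5."""
    return productivity / labor ** 0.5

labor_abundant, labor_scarce = 400.0, 100.0   # low-wage and high-wage regions

# Labor moves toward the higher wage until the gap (almost) closes.
while wage(labor_scarce) - wage(labor_abundant) > 0.001:
    labor_abundant -= 1.0
    labor_scarce += 1.0

print(labor_abundant, labor_scarce)              # 250.0 250.0
print(wage(labor_abundant), wage(labor_scarce))  # equal wages
```

The indirect route reaches the same endpoint without anyone moving: exports of the good from the input’s low-priced region raise the derived demand for the input there until the two regional input prices meet.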

The importance of cultural heterogeneity has been almost completely overshadowed by the modern obsession with “diversity,” which might be concisely described as “difference for difference’s sake.” Unlike mindless diversity, cultural heterogeneity is rooted in economic logic. Migration is governed by the logic of productivity; people move from one place to another because they are more productive in their new location. Estimates indicate, for example, that some low-skilled Mexican workers are as much as five times more productive in the U.S. than in Mexico.

That is only the beginning of the benefits of migration. Because workers often complement the efforts of other workers, immigration also raises the productivity (and wages) of native workers. And there is another type of benefit that is seldom, if ever, noticed.

The late, great Nobel laureate F.A. Hayek defined the “economic problem” more broadly than merely the efficient deployment of known inputs for given purposes. He recognized that all individuals are limited in their power to store, collate and analyze information. Consumers do not recognize all choices available to them; producers do not know all available resources, production technologies or consumer wants. The sum of available knowledge is not a given; it is locked up in the minds of billions of individuals. The economic problem is how to unlock it in usable form. That is what free markets do.

Our previous extended example involving immigration and the evolution of American cuisine illustrates exactly this market information process at work. The free market made it efficient and attractive for immigrants to come to the United States. American consumers became acquainted with a vast new storehouse of potential consumption opportunities – eventually, U.S. entrepreneurs could also mine this trove of opportunity. Immigrant producers became aware of a new source of demand and new inputs with which to meet it. And the resulting knowledge became embedded in the mosaic of American culture, making our cuisine the most cosmopolitan in the world.

The upshot is that, without consciously realizing it, Americans have had access to vast amounts of knowledge, expertise and experience. This store of culture has acted as a kind of pre-cybernetic Internet, the difference being that culture operates outside our conscious perception. At best, we can observe its residue without directly measuring its input. One way of appreciating its impact is to compare the progress of open societies like the U.S. with civilizations that were long closed to outside influence, like Japan and China. Isolation fearfully retarded economic development.

Status Mobility

In his recent book, Unintended Consequences, financial economist Edward Conard stresses the necessity of risk-taking entrepreneurial behavior as a source of economic growth. The risks must be organically generated by markets rather than artificially created by politicians; the latter were the source of the recent financial crisis and ensuing Great Recession.

According to Conard, it is the striving for status that drives entrepreneurs to run big risks in search of huge rewards that few will ultimately attain. Status may take various forms – social, occupational or economic. Its attraction derives from the human craving to distinguish oneself. It is this need for disproportionate reward – whether measured in esteem, dollars or professional recognition – that balances the high risk of failure associated with big-league entrepreneurship.

In the U.S., status striving has long been ridiculed by sociologists and psychologists. “Keeping up with the Joneses” has been stigmatized as a neurotic preoccupation. Yet the American version of status compares favorably with its ancient European ancestor.

England is famous for its class stratification. A half-century ago, its “angry young men” revolted against a stifling class system that defined status at birth and sharply limited upward mobility. Elsewhere in Europe, lingering remnants of the feudal system remained in place for centuries.

But the U.S. was comparatively classless. Economics defined its classes, and the economic categories embodied a high degree of mobility. Even those who started on the bottom rung usually climbed to the higher ones, where the rarefied climate proved difficult to endure for more than a generation or two.

The best feature of the status-striving U.S. class system has been the broad distribution of its benefits. The unimaginable fortunes acquired by titans of industry like Carnegie, Rockefeller, Gates, Buffett, et al. have made thousands of people rich while building a floor of real income under the nation. Our working lives and leisure have been defined by these men. The value created by a Bill Gates, say, is almost beyond enumeration.

Thus, it is not the striving for status per se that makes a national economy exceptional. It is the mobility that accompanies status. This will determine the form taken by the status striving process.

Before free markets rose to prominence, wealth was gained primarily through plunder. Seekers after status were warlords, kings or politicians. They gained their status at the expense of others. Today, plunder is the exception rather than the rule. Drug cartel bosses are a vestige of Prohibition; they profit purely from the illegalization of a good. Politicians are their counterpart in the straight world.

When status is accompanied by mobility, anybody can gain status. But they cannot have it without increasing the real incomes of large numbers of people. Ironically, the biggest complaint lodged against the American version of capitalism – that it promotes greed and income inequality – turns out to be dead wrong. Mobility is achieved through competition and free markets, which absolutely demand that in order to get rich the status-striver must satisfy the wants of other people en masse. And income inequality is the inevitable concomitant of risk-taking entrepreneurship – somebody must bear the risks of ferreting out the dispersed information about wants, resources and technologies lodged in billions of human brains. If we don’t reward the person who succeeds in doing the job, the billions of people who gain from the process don’t get their real-income gains.

Free Markets

You might suppose that bureaucracy was invented by the New Deal. In fact, Elizabethan England knew it well. Price controls date back at least to the Roman emperor Diocletian. Prior to Adam Smith’s lesson on the virtues of trade and David Ricardo’s demonstration of the principle of comparative advantage, the philosophy of mercantilism held that government must tightly regulate economic activity lest it burst its bonds. Thus, free markets are a historical rarity.

England’s abolition of the Corn Laws in the mid-1800s provides a brief historical window on a world of free international trade, but the U.S. prior to 1913 probably best approximates the case of a world power living under free markets. Immigration was uncontrolled and tariffs were low; both goods and people flowed freely across political boundary lines.

Prices coordinate the flow of goods and services in the “present”; that is, over short time spans. Production and consumption over time are coordinated by markets developed to handle the future delivery of goods (futures and forward markets) and by prices that modify the structure of production and consumption in accord with our needs and wants for consumption and saving in the present and the future. For the most part, these prices are called interest rates.

Interest rates reflect consumers’ desires to save for future consumption and producers’ desires to invest to augment productive capabilities for the future. Just as a price tends to equalize the amount of a good producers want to produce and consumers want to purchase in a spot market, an interest rate tends to equalize the flow of saving by consumers with the investment in productive capital by producers. Without interest rates, how would we know that the amounts of goods wanted by consumers in the future would correspond to what producers will have waiting for them? As it happens, we are now experiencing first hand the answer to that question under the Federal Reserve’s “zero-interest-rate policy,” which substitutes artificial Federal Reserve-determined interest rates for interest rates determined by the interaction of consumers and producers.
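
A stripped-down numerical sketch may help fix ideas. The linear saving and investment schedules below are assumptions invented for illustration; the only features carried over from the text are that saving rises, and investment falls, with the interest rate, and that the market-clearing rate reconciles the two sets of plans.

```python
# Toy loanable-funds market; schedules and numbers are hypothetical.

def saving(r):
    """Households save more at higher interest rates."""
    return 100 + 2000 * r

def investment(r):
    """Producers invest less at higher interest rates."""
    return 300 - 2000 * r

# Bisection search for the market-clearing rate r*.
lo, hi = 0.0, 0.25
for _ in range(60):
    mid = (lo + hi) / 2
    if saving(mid) < investment(mid):
        lo = mid   # excess investment demand: the rate must rise
    else:
        hi = mid   # excess saving: the rate must fall

r_star = (lo + hi) / 2
print(round(r_star, 3))                    # 0.05
print(saving(r_star), investment(r_star))  # both ~200

# Pegging the rate at zero leaves planned investment far above planned
# saving: the plans of consumers and producers no longer mesh.
print(saving(0.0), investment(0.0))        # 100.0 vs. 300.0
```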

Without knowing what policies were followed, we can scrutinize development outcomes in places like China, India, Southeast Asia and Africa and draw appropriate inferences about departures from free markets. High hopes and failure were associated with statism and market interference in China, India and Africa for over a half-century. Successful development has followed free markets like pigs follow truffles. But the obstacles to free markets are formidable, and no country has as yet found the recipe for keeping them in force over time.

What About Political Freedom?

Discussions of American exceptionalism invariably revolve around America’s unique political and constitutional history and its heritage of political freedom. Yet the preceding definition of exceptionalism has leaned heavily on economics. The world does not lack for political protestations and formal declarations of freedom and justice. Many of these are modeled on our own U.S. Constitution. History shows, though, that only a reasonably well-fed, prosperous population is willing to fight to preserve its political rights. Time and again, economic freedom has preceded political freedom.

When the level of economic development is not sufficient and free markets are not in place, the populace is not willing to sacrifice material real income to gain political freedom because it is too close to the subsistence level of existence already. And even in the exceptional case (usually in Africa or Latin America) in which a charismatic, status-striving leader heads a successful political movement, the leader will not surrender leadership status – even though the ostensible purpose of the independence movement was precisely to gain political freedom. Instead, he or she preserves that status by cementing political power for life. Why? Because there is no substitute status reward to fall back on; his or her economic and social status depends on wielding political power. This is the fault of the political Left, which has demanded that “mere” economic rights be subordinated to claims of equality, with the result that neither equality nor wealth has been realized.

Observation shows that when economic growth begins – but not before that – people begin to sacrifice consumption to control pollution and improve health. Similar considerations apply to political freedom. Expressing the relationship in the jargon of economics, we would say that political freedom is a normal good. This means that we “purchase” more of it as our real incomes increase. In this context, the word “purchase” does not imply acquisition with money as the medium of exchange; it means that we must sacrifice our time and effort to get political freedom, leaving less leisure time available for consumption purposes.
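
In standard textbook notation (a general definition, not specific to this essay), a good $x$ is normal when the quantity demanded rises with income $I$ – that is, when its income elasticity is positive:

$$\eta_{x,I} \;=\; \frac{\partial x}{\partial I} \cdot \frac{I}{x} \;>\; 0$$

Here the “price” paid for political freedom is time and effort rather than money, but the prediction is the same: richer populations demand more of it.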

The U.S. was the exception because its economic freedom and real income was well advanced before the Revolution. Enough Americans were willing to oppose the British Crown to achieve independence because colonial America was living well above the subsistence level – at that, the ratio of rebels to Tories was close to even. George Washington was offered a crown rather than a Presidency, but he declined – and declined again when offered a third Presidential term. His Virginia plantation offered a substitute status reward; he did not need to hold office to maintain his economic status or social esteem. It is interesting to speculate about the content of the Constitution and the course of U.S. history had the U.S. lacked the firm economic foundation laid by its colonial history and favorable circumstances.

DRI-325 for week of 5-26-13: Stockman on Reagan: He Should Have Known

An Access Advertising EconBrief:

Stockman on Reagan: He Should Have Known

The publishing sensation du jour – at least in the field of politics and economics – is the cautionary memoir of former Reagan administration budget director David Stockman, entitled The Great Deformation. The title plays on the Protestant Reformation of the 16th century, which befits the missionary zeal Stockman brings to his tale. The titular deformation was suffered by American capitalism during the 20th century, but particularly during Stockman’s adult life.

Stockman’s book is part history, part gloomy prophecy and part score-settling with his critics. Each part has value for readers, but it is his historical recollections that deserve primary attention. Stockman assigns significant responsibility for the ongoing demise of free-market capitalism to policies initiated during the Reagan administration he served. One of those policies produced what is widely considered President Reagan’s crowning achievement – the demise of the Soviet Union.

Stockman’s Revisionist View of Soviet Decline

The conventional account has the Soviet Union declining rapidly during the 1980s before finally toppling of its own weight between 1989 and 1991. The proximate cause was economic: the Soviet economy was so ponderously inefficient that it eventually lost the capacity to feed and clothe its own citizens, most of whom were forced to either stand in queues for hours daily at government stores or purchase basic goods at elevated prices in the black market. The government devoted most of its resources to producing military goods or subsidizing Communism abroad. The lack of a functioning price system – current prices for goods and services and interest rates determining the value of capital goods – severed the link between the production of goods and services and the wants of the people, thereby leaving the economy adrift and floundering.

Stockman does not quarrel with this verdict. In fact, he endorses it so forcefully that he claims that the celebrated Reagan-administration defense build-up of the early 1980s was unnecessary and counterproductive. It was unnecessary because the Soviet economy was already collapsing of its own weight and, consequently, the Soviet military was _no threat_ to the U.S. (The emphasis is mine.) It was counterproductive because military expenditures are an inherent drag on the production of goods and services for private consumption. And during the 1980s, we experienced what Stockman called “the greatest stampede of Pentagon log-rolling and budget aggrandizement by the military-industrial complex ever recorded.”

Stockman accuses Reagan defense Secretary Caspar Weinberger of selecting 7% annual increases in a baseline defense-spending figure of $142 billion annually out of a hat, simply because it represented the midpoint between candidate Reagan’s promised 5% annual increase and the 8-9% demanded by a hawkish group of advisors led by Senator John Tower of Texas. Stockman waxes at indignant length about the budgetary waste embedded in this program. He juxtaposes this fiscal profligacy alongside the Administration’s ineffectual efforts to cut spending on entitlement programs, which resulted in a feeble reduction of 1/3 of 1% in the percentage of GDP deployed by government.

Stockman goes on to record the fiscal depredations of the two Bush administrations that followed, curiously lauding the Clinton administration for its budget “surpluses” despite the fact that these were really accounting artifacts achieved by off-budget borrowing. He then describes, with mounting alarm, the fiscal death spiral executed by the Bush-Obama regimes – more properly linked as a hyphenate than were Reagan-Bush – which combined monetary excess with fiscal profligacy to nose-dive the U.S. economy into the ground.

The 742-page volume contains a wealth of valuable material, much of it insider information delivered by a participant in federal budgetary battles and Wall Street machinations. But the datum of immediate interest is Stockman’s putative debunking of Reagan’s role as Soviet dragon slayer.

History as Hindsight

David Stockman claims that the Soviet Union was in terrible economic shape when Ronald Reagan took office in January, 1981 – so bad that it presented no serious military threat to the U.S. Given the force of his argument, it may seem somewhat surprising that he presents no direct evidence to support this claim. It is not difficult to grasp the nature of his inferential case, though.

The Soviet Union’s political collapse began in 1989 and occurred with shocking suddenness. Even more stunning was its non-violence; hardly a shot was fired despite various confrontations involving tanks and troops. Stockman’s implicit argument presumably runs something like this: “In order for public mobilization against the government to have attained this critical mass by 1989, economic deterioration as of 1981 must have been well under way and quite advanced.” This is a reasonable inference. Moreover, it is supported by the research and release of documents that occurred in the window of time between 1991 and the onset of the Putin regime, after which access to Soviet archives was again closed to the West and even to Russian researchers.

But that isn’t all. Implicitly, Stockman continues along the following lines: “Because we now know that the Soviet Union was scheduled to fall apart beginning in 1989, it was therefore unnecessary for the Reagan administration to waste all that money on its defense build-up. And since that build-up was a major contributor to the ensuing decline and fall of the U.S. economy and free markets, Reagan himself must bear a big share of responsibility for the fix we are in today.” Obviously, the quoted paraphrase is my interpretation of Stockman’s argument; the reader will have to judge its fairness.

But if it is a fair rendering of Stockman’s case, then that case fails utterly. Each of the three elements comprising it is false. The first of these is the most obvious. Stockman has committed the fallacy of hindsight. Thirty years later, we now know that the Soviet Union was scheduled to fall apart in 1989. But in 1981, Ronald Reagan didn’t know it. In fact, nobody knew it.

Soviet Disintegration: The View From 1981

Based on David Stockman’s harsh judgment of Ronald Reagan’s conduct, one reads The Great Deformation with bated breath, waiting for the meeting with Reagan at which Stockman sternly declaims, “Mr. President, the Soviet Union is falling apart. You know it as well as I do. How dare you waste all this money on military spending? You’re going to spend us into the poorhouse. Thirty years from now, our economy will implode.”

But we wait in vain; no such passage is included in the book. Presumably, no such conversation took place because David Stockman was just as ignorant of the Soviet Union’s true economic status as Reagan was.

Actually, there is every reason to believe that Reagan was better informed than Stockman. At least two books have been written about Reagan’s campaign to win the Cold War, the better one being Reagan’s War, by Peter Schweizer. We learn that Reagan himself expected the Soviet economy to collapse; he was one of the few people outside the ranks of hard-core free marketers who did. What he didn’t know was when this would happen.

This is the economist’s eternal bugbear, after all, and Reagan was the only U.S. President to actually hold a degree in economics. Economists usually know what’s going to happen, but they are notoriously unable to predict the timing of events, which accounts for their lackluster forecasting reputation.

At the point of Reagan’s inauguration, the Soviet Union’s star was ascendant internationally. It had not yet retreated from Afghanistan and its advisors and acolytes were meddling in Third World countries around the world. The U.S. was under worldwide pressure to succumb to détente and negotiate away its nuclear superiority – the very factor that Stockman claims made its defense build-up superfluous.

Since Stockman didn’t know that the Soviet Union was a basket case but is implicitly saying that Reagan should have known it, he must mean that somebody else who did know it told him the truth, or should have told him. Who would this have been? The logical candidate would have been the CIA. But we now know that the CIA didn’t know it; their failure to provide advance warning of the Soviet collapse is proverbial.

Perhaps Stockman thinks that economists should have anticipated the economic collapse of the Soviet Union. At first thought, this doesn’t seem so unreasonable, does it? But the leading academic economist of the day, Paul Samuelson of MIT, is now famous – no, make that infamous – for including a running graph in succeeding editions of his best-selling college textbook that shows the Soviet Union overtaking the U.S. in per-capita economic growth during the 1980s. (Actually, the projected point of convergence was quietly pushed further into the future in later editions.) Clearly, Samuelson didn’t have a clue about actual economic growth in the Soviet Union or he never would have made a fool of himself in print for posterity.

Maybe some of Reagan’s free-market economist friends knew the truth, or should have known. Certainly F.A. Hayek and Ludwig von Mises, who argued in the 1930s that economic calculation under socialism was impossible, knew that the Soviet economy was fated to collapse. But Mises was dead by 1981 and Hayek was in his 80s, only recently rehabilitated within the profession by the award of his Nobel Prize in 1974. In reality, free-market economists were demoralized by their ostracism from the profession and the seeming invulnerability of the Soviet Union to public criticism and the laws of economics. They had long ago predicted its demise, only to be confounded and humiliated when it kept rolling along – aided in no little part by the subsidies it received from Western governments and the praise it gained from Western intellectuals. Few, if any, free-market economists were optimistically predicting the end of Communism in 1981.

Realistically speaking, nobody should now expect Ronald Reagan to have predicted the future then. But suppose, hypothetically, that he knew the general state of the Soviet economy. What then? The fact is that even this knowledge would not have made Stockman’s argument valid – just the opposite would seem to be true.

History is Not a Controlled Experiment

If the Soviet Union had been as impregnable as most of the world believed it to be, then there might have been a case for détente or a milder policy of rapprochement. Reagan didn’t know the truth, but he suspected that the Soviet economy was shaky. He told his advisors, “Here’s my strategy on the Cold War: We win, they lose.” He wanted to give the Soviet economy the push that would shove it over the cliff and destabilize the regime. He knew that forcing the Kremlin into an arms race would pose a fatal dilemma: Either the Soviet government would devote more resources to military production or it would refuse the challenge and devote more resources to civilian goods and services. The former choice would expose it to civil revolt and eventual rebellion. The latter choice would condemn it to inferiority in conventional arms as well as nuclear capability; it would pose no threat to the U.S. or the rest of the world and could be isolated to die on the vine in good time.

The Soviet Union had killed upwards of 100 million people during the 20th century by means of execution, deliberately contrived famine and exile to gulags. Reagan felt that the Soviet economy would collapse eventually. But when? For all he knew, it might take ten years, twenty years, thirty years – after all, the Soviets had lasted 64 years up to that point and they had most of the world on their side. There seemed to be a window of opportunity to win the Cold War now, but he wouldn’t be President forever. If he failed to do the job on his watch, his successor(s) might give away whatever advantage he had gained. In the event, despite the single-minded dedication of Reagan and his small band of advisors, it took two full terms – nearly 9 years – to get the job done as it was.

World War II was a conventional war against totalitarianism, fought with conventional weapons against Hitler, Tojo and Mussolini. It was won by spending (and wasting) vast quantities of money on soldiers, bombs, ships, planes, tanks and the like. Most of these armaments were actually used for their intended purposes. Millions of people were killed in the process.

The Cold War was an unconventional war against totalitarianism, fought and won by Ronald Reagan by the unconventional means of spending his Soviet opponents into submission when they elected to compete with him militarily. The superior American economy provided the wherewithal to defeat the inferior Soviet economy. In the direct sense, nobody was killed in the process, although the tremendous waste involved did cost many lives. But because the Soviet Union killed millions of people directly and indirectly and would have continued to do so, Reagan’s actions saved many lives.

The futility of Stockman’s case is illustrated by his criticism of Reagan’s build-up of conventional weapons. Reagan had campaigned against the Soviet Union’s nuclear capability, Stockman maintained, yet insisted on spending vast sums on conventional weapons. “What actually kept the Soviets at bay was the retaliatory [capability] of submarine-based Trident missile warheads…along with…land-based minuteman ICBMs…This deterrent force was what actually kept the nation safe and had been fully in place for years.” In contrast, Stockman scoffed, “the $20 billion MX ‘peacekeeper’ missile…was an offensive weapon that undermined deterrence and wasn’t actually deployed until the Cold War was nearly over.” No, the MX contributed as much, if not more, to winning the Cold War than the Tridents and the ICBMs because all these weapons were not used to fight a conventional war. They were not even valuable primarily for their deterrent effect, although that was also quite useful. They “fought” the war peacefully by forcing the Soviet Union to use resources to compete with them. Indeed, the greatest weapon of the Cold War was never even produced. Former Soviet leader Mikhail Gorbachev and his advisors specifically cited the Strategic Defense Initiative (derisively termed “Star Wars” by its detractors) as the crucial factor in the Soviet Union’s demise.

At this juncture in the discussion, the point may seem almost too obvious to need stress: Ronald Reagan’s actions themselves contributed to the collapse of the Soviet Union and may well have been its proximate cause; therefore we cannot assume that the Soviet Union would have collapsed anyway if Reagan had not acted as he did.

The first element of Stockman’s case is that because we now know the Soviet Union was falling apart, Reagan should have known it and should have guessed that the Soviet Union would therefore collapse in 1989. This is false; it is the hindsight fallacy. The second element is that because the Soviet Union did collapse in 1989, it would have done so even if Ronald Reagan had not actively won the Cold War. This is not merely false; it is an absurdity since Ronald Reagan’s actions themselves contributed decisively to the speed and completeness of the collapse.

The Fiscal Fallout: Eisenhower vs. Bush

At this point, David Stockman’s case against Ronald Reagan as Cold Warrior is on the ropes. But there remains a counter-argument to the points raised in opposition. Even if Reagan did win the Cold War, it must have come at a terrible cost if it led on a direct line to the end of free-market capitalism in America and impending fiscal and monetary collapse, as David Stockman says it did. Who could argue with that?

David Stockman, as a matter of fact. Stockman himself provides the final refutation to his own argument against Reagan’s role in the Cold War. And, incredibly, he shows no realization of it. Stockman lists President Eisenhower among his heroes for the courage he displayed by taking on a job disdained by his predecessor, Harry Truman. Eisenhower and his Treasury Secretary, George Humphrey, recognized that wartime tax rates were far too high to promote prosperity. But they were determined to complement a reduction in tax rates with spending reductions that would bring government’s percentage take of GDP (then called gross national product or GNP) back into line with historic norms. So between them they hammered out over $145 billion in defense-budget reductions over three years that accomplished the de facto demobilization of the military.

Stockman is fortunate that the Reagan-era parallel was not a snake; else he would be in need of anti-venom serum. The Soviet Union’s collapse was not clear-cut until just after Reagan left office, so Reagan himself had no chance to wind down the successful chain of events he had set in motion. To make Reagan’s triumph complete, his successor George Bush needed to do the same job Eisenhower did – namely, drastically downsize a military budget whose only rationale had been to win the Cold War non-violently. It was the cravenness and stupidity of the Bush Administration, not Ronald Reagan’s failings, that started the budget rot leading to our present problems. At least, that is the end product of David Stockman’s own logic.

The third element of David Stockman’s case – that Reagan’s fiscal program led directly to our present malaise – is refuted by Stockman’s own choice of Eisenhower as hero and Stockman’s explanation of Eisenhower’s defense-budget cuts. Bush, not Reagan, was the malefactor.


The Stockman Manifesto

David Stockman’s manifesto is a tour de force that commands our close attention. In particular, his debunking of the so-called “financial crisis” in 2008 should be required reading for every American. Unfortunately, his omnibus explanation of our fiscal and monetary woes explains too much – or not enough, depending on how you choose to express it. Perhaps Stockman’s own involvement in the Reagan Administration colored his analysis of Reagan’s policies. For whatever reason, his history of the Cold War is inexcusably short-sighted and unworthy of somebody whose views are otherwise acute.

DRI-465: The Road to Utopia

Utopia is the state of human perfection. Man has sought it for thousands of years. Theologians and prophets, wisely despairing of earthly success, preferred to locate it in the afterlife. Sir Thomas More’s take on the subject began a relocation process that picked up speed in the eighteenth century, when rationalist philosophy posited a human mind with the ability to comprehend itself and construct its own future. When human reason replaced God, politics supplanted religion as the road to utopia.

Of all the impediments blocking success of the utopia project, war has loomed largest. Throughout recorded history, war has been conducted between nation states. It seemed natural to equate nationalism with chauvinism, and many thinkers have done so. T. H. White’s famous parable had Merlin urging Arthur to see his land as if he (Arthur) were a bird in flight, to which boundary lines are invisible.

Modern political science has viewed national boundaries with ill-concealed distaste while mostly continuing to indulge its utopian leanings. The growth of the size and scope of government has made government policy the major outlet for the utopian urge. That urge has found its strongest expression in the obsession with ending war.

In the 20th century, the primary tools fashioned for this purpose were foreign policy and foreign aid. It is instructive to compare the effects of these political tools with those of the decidedly non-utopian science of economics, deployed to the same end.

World War I and Postscript

The prevailing foreign policy prior to World War I revolved around preserving a military balance of power between nations. To that end, major powers like Great Britain, France, Germany and the Austro-Hungarian Empire interwove a fabric of military alliances with each other and lesser powers. This strategy allowed the assassination of an archduke in Middle Europe to drag the world into war.

Blame for the decimation of European manhood in the trenches of France and Belgium was laid on the losers and on the institution of war itself. Pacifists and statesmen like Cordell Hull steered foreign policy in the opposite direction by negotiating treaties like the Kellogg-Briand Pact, which supposedly outlawed war as an instrument of national policy.

The victorious Allied powers – Great Britain, France and the United States – were able to dictate terms to the losers – Germany and Austria-Hungary. French leader Clemenceau wanted to exact revenge and monetary remuneration for France’s defeat in 1871. Great Britain’s Lloyd George wanted to offset Britain’s massive war debt. The duo demanded that Germany be forced to pay huge reparations to the Allies.

The Transfer Problem

Forcing World War I reparations on Germany was one of history’s truly bad ideas. At the time, though, almost nobody knew it. One of the few who did was John Maynard Keynes, one of the leading British economists. In 1919, he wrote one of the most influential books of the century, The Economic Consequences of the Peace. Today, Keynes is famous as the father of macroeconomics, but the truth is that he was not a brilliant or original theorist. His book relied on the theory of international trade developed by classical economists in the eighteenth and nineteenth centuries and refined by his mentor, Alfred Marshall.

The salient idea is that a financial transfer between nations must have a real counterpart – that is, a corresponding transfer of goods or (possibly) services. The matter cannot end with the German treasurer writing a check to the British or French treasury. The check will be deposited in the recipient government’s bank account, and the bank will exchange it for units of its own national currency. The foreign-exchange dealer who makes that exchange does so in the expectation of being able to sell the German marks to people who want to buy German goods. Eventually that happens: British and French citizens buy German goods. In effect, Germany has given away real goods as punishment for its war-related sins.
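
To make the mechanism concrete, consider a stylized illustration (the symbols and figures here are invented for arithmetic’s sake, not drawn from the historical record). Let R be the reparations bill, X the value of German exports and M the value of German imports. The transfer can only be consummated through a trade surplus:

\[ R = X - M \]

If the Allies demand the equivalent of 2 billion marks per year, German exports must exceed German imports by 2 billion marks’ worth of real goods annually – goods produced by Germans but consumed abroad. The check and the goods are two sides of the same transaction.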

Popular grasp of economics was no better then than now, and most people of the day probably had some vague notion that the reparations would be paid by “the German government” or even the Kaiser himself. Instead, they were paid by the population at large, whose living standards fell because fewer goods were available, at higher prices, than would otherwise have been the case.

Keynes also claimed that Germany would suffer a secondary burden. In order to transfer goods abroad, its exports would have to rise relative to its imports, and this would mean that its terms of trade – its export prices compared to the prices it paid for imports – would deteriorate. This wasn’t necessarily true. But what is beyond debate is that Germany suffered terribly under the reparations imposed by the Allies. And that suffering was worsened by Britain’s refusal to allow imports of goods that competed with its own manufactures, which threw still more of the adjustment burden onto German prices.
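
Keynes’s secondary burden can be put in the same shorthand (again, purely as an illustration). Define a country’s terms of trade as the ratio of its export prices to its import prices:

\[ \text{terms of trade} = \frac{P_{\text{exports}}}{P_{\text{imports}}} \]

To coax foreigners into absorbing the extra exports, Germany would have to cut its export prices, so the ratio falls; each unit of imports then costs more units of exports than before – a second loss stacked on top of the transfer itself. Whether the ratio actually falls depends on how eagerly foreigners demand German goods, which is why the claim, as noted above, wasn’t necessarily true.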

The Weimar Republic, Hyperinflation and Hitler

In an attempt to cope with falling prices and the net monetary outflows required by reparations, the German authorities under the Weimar Republic printed vast quantities of money. This caused one of the worst hyperinflations of modern times.

The Weimar Republic’s hyperinflation discombobulated the nation so badly that Adolf Hitler’s National Socialist (Nazi) Party – modeled in part on Mussolini’s Italian Fascist movement – found increasing favor with the German public. In 1933, Hitler was appointed Chancellor of Germany by the aging President Paul von Hindenburg. Throughout that decade, Germany pursued a policy of rearmament and territorial expansion. Meanwhile, Britain and France maintained their anti-war policies by appeasing Hitler’s expansionist demands.

In sum, the post-World War I foreign policy of pacifism and economic punishment meted out to the citizens of “aggressor nations” was as disastrous as the pre-war policy of militarism had been. Economic disintegration merely produced more militarization and war.

Run-up to World War II – The Roots of Japanese Aggression

The 20th century’s first major war was between Russia and Japan in 1904-05. Russia sought to consolidate its hold over Manchuria in order to enjoy year-round, warm-water trade out of Port Arthur. The Japanese coveted Manchuria and Korea as a buffer against territorial encroachment by the Tsar.

The result was a short but bloody war, ended by the intervention of U.S. President Theodore Roosevelt, whose Nobel Peace Prize belied the disenchantment felt by both principals. The Russians were at war again within the next decade and succumbed to Communist revolution in 1917. Meanwhile, Japan launched a program of imperialist aggression that spread to China and the whole of Southeast Asia.

The Japanese finally met their match when they took on the United States in late 1941. Once again, U.S. foreign policy and trade both figured prominently. President Roosevelt and his advisors were desperate for an excuse to intervene militarily on England’s behalf against Nazi Germany. Despite Hitler’s blatant provocations, treaty violations and internal persecution of Jews, Americans overwhelmingly opposed entry into the European war. Ethnic and trade ties to Germany were very strong in the U.S. In fact, several prominent American corporations continued to do business with the Third Reich even after hostilities broke out.

The Roosevelt administration developed a plan to provoke an attack by the Japanese. The plan, written down in a memo to the President by Lt. Commander Arthur McCollum, recommended eight specific provocations. One of these was an embargo on oil exports to Japan by the U.S. The actions were adopted beginning in October 1940. Japan’s attack came fourteen months later.

Looking back at the interwar period, it is noteworthy that the suppression of international trade with Germany and Japan led to desperation and war. But where trade was already established – between Nazi Germany and the U.S. – the outbreak of war was not powerful enough to halt it completely. This latter fact is customarily viewed as shameful. Yet if war is indeed the greatest scourge known to mankind, as Gen. MacArthur claimed in his speech at the Japanese surrender ceremony, shouldn’t we celebrate the fact that trade relationships can withstand even a declaration of war? It suggests, after all, that trade can reduce the threat or likelihood of war.

Japan and Germany After World War II

After World War II, both Japan and Germany were forbidden from raising standing armies. This left them free to devote their resources to peacetime pursuits; e.g., the production of non-military goods and services.

At first, the Germans were hamstrung by the fact that their production and transportation infrastructure had been devastated by Allied bombing and tactical operations. Just as devastating was the system of wage and price controls instituted by the Allied governors who oversaw the program of de-Nazification and gradual transition back to local control.

In 1948, Ludwig Erhard – then economic director of the Anglo-American occupation zones and later chancellor of West Germany – defied the Allies and all of his own advisors except one, the free-market economist Wilhelm Röpke. Overnight, Erhard abolished the price controls. What followed has become known as the “German Miracle.” Germany became the bellwether of the European Common Market, a customs union designed to promote international trade among its members and the forerunner of today’s European Union.

The Japanese were similarly hampered by the new constitution designed under their Allied governor, Gen. Douglas MacArthur, who modeled it after Franklin Roosevelt’s New Deal. Gradually, however, they learned to cope with the fact that their institutions were not well-suited to free markets. They set out to study Western production methods, particularly the ideas of statistician W. Edwards Deming, who became their quality-control guru. Starting in the 1950s, Japan became the international economic powerhouse of the Far East, building an economy heavily dependent on exports and rising steadily to the position of the world’s second-largest economy.

It seems that both Germany and Japan became roaring successes and benefactors to the world almost as soon as they quit practicing foreign policy and began engaging in international trade on a large scale. Sometimes this is cited as a complaint, as if they were freeloaders benefiting from options not available to other countries. Others object that the two countries were forced to desist from military operations – but this had also been true of Germany between the World Wars. The difference was that the Versailles reparations eviscerated Germany’s economy and delivered the country into the hands of fascism, which re-militarized it in defiance of the peace treaty.

The Foreign Aid Fiasco

The failure of foreign policy to usher in a peaceful utopia did not discourage proponents, who adopted another policy tool with the same goal. Foreign aid was designed as a kind of international income- and wealth-redistribution policy. By appropriating taxpayer funds in developed countries and handing them over to less-developed countries (LDCs), utopians were purportedly practicing “economic development” – using government policy to make per-capita real income grow steadily larger.

The middlemen in this crusade were primarily international lending agencies like the World Bank and the International Monetary Fund – both of which had originated with other mandates but whose bureaucratic chief executives knew a reliable source of funds when they saw one. The general idea was that the Bank and the IMF would seek out needy countries and administer loans to their governments – while carrying large staffs of university-trained economists and earning hefty fees, of course.

The foreign-aid gravy train ran like a railroad – but with even less success – for over fifty years. Observers like the late Lord Peter Bauer and William Easterly have meticulously chronicled the crashing failure of aid programs to promote economic development. Large-scale infrastructure projects did not mesh with the surrounding capital structure. Aid funds were siphoned off by corrupt dictators, their cronies and relatives. Little or no attention was paid to the conditions necessary for free markets to gestate and thrive. The biggest failures were countries that, like India, received the most money and attention and seemingly possessed the raw material necessary for success.

That does not mean there were no economic-development successes during the 20th century – only that they weren’t achieved via foreign aid. In East and Southeast Asia, the “Asian Tigers” rose from the ashes of the Vietnam era to become regional powerhouses, mostly by practicing free trade in free markets. Belatedly, India awoke from decades of post-colonial torpor by freeing up domestic markets and beginning to export its own labor services. China achieved some of the fastest economic growth in history by largely renouncing Communist economics and wresting the title of world’s export powerhouse away from Japan.

The Pattern

The pattern that emerged over the course of the 20th century is shockingly clear. The tools of foreign policy and foreign aid were intended to foster world peace, by way of achieving prosperity and fairness. Instead, they encouraged and actually caused war, the very thing they were supposed to eradicate. But the most warlike countries became paragons of virtue – peaceful, prosperous and beneficent – when they practiced international trade. It was only when they were denied the benefits of trade that they turned to war.

Utopian attempts to minimize the meaning and importance of borders have failed miserably. Political borders exist because they serve the useful purposes of separating populations into harmonious groups and keeping governmental units small and manageable. Allowing free trade across political borders lets us have the cake of efficient political borders while eating the economic benefits of free trade. Those benefits are efficient production along lines of comparative advantage and optimal consumption resulting from voluntary exchange.
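
A stylized Ricardian example (the numbers are illustrative only) shows where those production gains come from. Suppose each unit of output requires the following hours of labor:

\[
\begin{array}{lcc}
 & \text{Cloth} & \text{Wine} \\
\text{Country A} & 1 & 2 \\
\text{Country B} & 4 & 3
\end{array}
\]

Country A gives up half a unit of wine for each unit of cloth it produces; Country B gives up 4/3 of a unit. So A specializes in cloth and B in wine, and at any exchange ratio between 1/2 and 4/3 units of wine per unit of cloth – say, one for one – both countries consume more than either could produce in isolation, even though A is absolutely more productive in both goods.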

The beauty of this arrangement is that it does not require that people resolve irreconcilable differences of religion, politics or ethnic history. Bitter enemies can become staunch trading partners – thus minimizing the likelihood that their disagreements will lead to all-out war.

The Long (Capitalist) Peace

Does this sound like just another utopian dream? Don’t look now, but it has already happened. According to scholars from disciplines like political science, philosophy and psychology, we are in the midst of a Long Peace. As Harvard psychologist Steven Pinker points out in his recent book, The Better Angels of Our Nature: Why Violence Has Declined, we may be living in the most peaceful era in the history of mankind.

Wars between economically developed nations have dwindled to virtual non-existence. (Even counting the Korean War yields only one such war in the last 67 years.) Wars in general are at a historic low. Violence of all kinds has declined dramatically. Murder, rape, riots and child abuse are down worldwide. Slavery and human sacrifice are effectively extinct. This phenomenon began to attract notice some three decades ago and still holds good.

True, scholars have been slow to recognize the part played by free international trade. Political scientists have found political causes; psychologists see the answer in psychology. (Pinker himself claims to find changes in brain evolution that have permitted us to access “the better angels of our nature.”) But that is changing. Scholars now speak of the “Capitalist Peace” – an idea Pinker describes as “a shock to those who remember when capitalists were considered ‘merchants of death’ and ‘masters of war.'” One such scholar ended a presidential address to his academic organization with the phrase, “Make money, not war.”

This outcome would not have surprised the classical liberal economists of the nineteenth century (today’s libertarians). They maintained that free trade would eliminate the primary causes of war – first, by enabling nations to make the best use of the human and non-human resources possessed by all; second, by vitiating the need for warfare to gain access to trade routes or scarce raw materials or minerals; and third, by making it counterproductive to fight nations that supply consumption goods and vital production inputs.

Recently, the volume of world trade reached its all-time high. This fact was not celebrated by the press, which preferred to stress the political and military tensions involving countries like Afghanistan, Iran, Iraq, North Korea and the Sudan. The technological revolution has raised our fact-gathering capabilities to undreamt-of heights, letting us pinpoint the world’s discontents at a moment’s notice. Without the guidance of historical perspective, however, we might fail to realize that the discontents of our modern civilization are minute in number and minuscule in size and scope compared to those of the past.

And without the counsel of economic logic, we wouldn’t understand exactly what has made the Long Peace so peaceful. It wasn’t the astute foreign policy formulated by presidents and prime ministers and advisors. It certainly wasn’t the trillions of dollars of foreign aid lavished on less developed countries by the Western developed nations. It was trade – not aid – that did it.

World free trade is not a utopian concept because it is a process, not a defined state. Freedom is the absence of coercion, not the achievement of pre-set goals. The attainment of peace through trade is another case in which the best way to get what you want is not by aiming directly at it. This should not be surprising, because maximizing freedom leaves the most options open to us.