An Access Advertising EconBrief:
Recently, several think tanks such as the American Enterprise Institute have quantified the degree of Americans’ dependence on government. Federal-government transfer payments have increased from less than $100 billion in 1960 to well over $2 trillion today. Even in real terms, adjusting for inflation, transfer payments per capita have increased sevenfold. In 1983, around 30% of U.S. households contained at least one member receiving a subsidy check from the federal government. Today, the number is close to 50%. The fraction of individuals between ages 18 and 64 who were receiving Social Security disability payments in 1960 was about 0.04%. By 2010, the percentage on disability had risen to about 4.6%. (This coincides with the time period in which government agencies charged with policing workplace safety were created.)
A book by AEI’s Nicholas Eberstadt summarizing our dependence on government is entitled A Nation of Takers. That conjures up the picture of a government-run gravy train on which an army of citizens queues up to hitch a ride, like hobos gathering just outside town at dusk. Transform that picture into a painting and its title would be: Subsidy Nation.
The Roots of Subsidy Nation
The roots of subsidy nation were planted in the 19th century with Henry Clay’s American System. Clay built a political coalition that offered American businesses protection from foreign competition in the form of tariffs – taxes levied on imported foreign goods. The owners and employees of import-competing domestic businesses gained from these tariffs. Everybody else lost, even allowing for the fact that the taxes were the primary basis for federal-government revenue prior to 1913. The American System died with Clay but the tariffs remain. Their height waxed with the Fordney-McCumber and Smoot-Hawley bills of 1922 and 1930, respectively, and waned with subsequent multilateral bureaucratic efforts to free up international trade through the General Agreement on Tariffs and Trade (GATT). But when the chips were down – e.g., when re-election was at stake – politicians could be relied upon to sacrifice the general interest of consumers and import-dependent producers to the special interests of import-competing producers.
American businessmen learned a valuable lesson from U.S. commercial policy. An ounce of protection purchased from government is worth a pound of competitive zeal. Subsidy Nation was born.
Over the course of American history, its farmers have made us the world’s breadbasket. Along the way, they have had to overcome the twin handicaps of price-inelastic and income-inelastic demand for most of their products. That is, when the prices of food and fiber decline, or when real incomes rise, people do not increase their purchases proportionally. After all, you can only eat so much or wear so many clothes. And falling prices were the norm throughout the 19th and 20th centuries, thanks to continuous improvements in technology and productivity and the resulting increases in agricultural supply.
But American farmers are tough. They know that when the going gets tough, the tough get going – to the government for handouts. Activist farmers carefully picked out a year when agricultural prices were high, then enshrined that year’s prices as their desideratum. They demanded “parity” – farm prices commensurate with those in the good old days. Eventually, after a few decades of hectoring by populist legislators, the New Deal acquiesced with agricultural subsidies.
Federal-government farm-subsidy programs have used techniques like price supports, which prop up prices artificially high using government purchases, creating huge surpluses and wasteful storage costs – not to mention the waste of resources in producing more agricultural goods than are desired in the first place. Target-price programs at least got rid of the surpluses by allowing market prices to fall until the surpluses were disposed of, while paying farmers a direct subsidy to make up the difference. But farmers were embarrassed at getting welfare checks instead of price-support checks, so taxpayers got stuck with the storage costs after all when target prices got shot down. Acreage allotments paid farmers not to farm part of their land. This produced a crop of undesirable consequences, ranging from overfarming and overfertilization of the smaller allotments to monopoly rents accruing to owners of the subsidized acreage.
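The arithmetic behind the two subsidy schemes can be sketched with hypothetical linear supply and demand curves. Every number below is an illustrative assumption, not an estimate of any actual farm market.

```python
# Stylized comparison of a price support vs a target-price (deficiency
# payment) program, using made-up linear supply and demand curves.

def demand(p, a=100.0, b=2.0):
    """Quantity demanded at price p: Qd = a - b*p."""
    return a - b * p

def supply(p, c=10.0, d=1.0):
    """Quantity supplied at price p: Qs = c + d*p."""
    return c + d * p

# Free-market equilibrium: a - b*p = c + d*p  ->  p* = (a - c) / (b + d)
p_star = (100.0 - 10.0) / (2.0 + 1.0)            # 30.0

# Price support set above equilibrium: government buys the surplus
# at the support price and must then store it.
p_support = 40.0
surplus = supply(p_support) - demand(p_support)  # 50 - 20 = 30
support_outlay = p_support * surplus             # 1200, plus storage costs

# Target price: farmers produce as if the price were p_target; the market
# price falls until consumers absorb the whole crop, and government pays
# farmers the per-unit difference. No surplus piles up.
p_target = 40.0
q_produced = supply(p_target)                    # 50
p_market = (100.0 - q_produced) / 2.0            # invert demand: 25.0
deficiency_outlay = (p_target - p_market) * q_produced  # 15 * 50 = 750

print(f"equilibrium price: {p_star}")
print(f"price-support outlay (excluding storage): {support_outlay}")
print(f"target-price outlay (no surplus to store): {deficiency_outlay}")
```

The sketch shows the essay’s point in miniature: both schemes cost taxpayers money, but the price support additionally generates a physical surplus that must be stored or destroyed.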
The joke turned out to be on farmers in the end. Technology gradually turned agriculture into big business by requiring sizable capital investment for machinery and expertise (both scientific and financial). Large corporations are designed for the express purpose of raising large amounts of capital. Not surprisingly, agribusiness took over most of agriculture. Subsidy programs rewarded farmers according to output; therefore, most of the farm subsidies went to owners of the large corporations. The political left wing, who had called the loudest for government involvement to protect the small “family farmer,” now screamed bloody murder when the beneficiaries turned out to be Archer Daniels Midland instead of Okies from the Dust Bowl.
Subsidy Nation was up and running.
The concept of “social insurance” dates back at least to the 18th century, but it found concrete expression in late 19th century Germany. Chancellor Otto von Bismarck had his hands full coping with the socialist movement that had swept over Europe beginning in 1848. To consolidate his power by appeasing the opposition to his left, he agreed to a system of old-age, sickness and accident benefits, funded and administered by the state. By the 1930s, proponents of Social Security had conveniently forgotten its origins and remembered only its intentions; namely, to end poverty and neglect suffered by the elderly.
The Roosevelt Administration sold its proposal to the American public by trading heavily on the word “insurance.” Social Security would take in “contributions” from citizens and “invest” them in “special trust funds” where they would be “pooled” for future need. In other words, it would operate much like private insurance, except it would have no need to earn profits and would consequently behave in a reasonable, compassionate manner toward its “beneficiaries.”
This always was, and remains, pure malarkey. Like all so-called “social insurance” systems of its day and until the last two decades, the U.S. Social Security system began as, and remains, a “pay as you go” system in which current benefits are paid from taxes levied on contemporaneous workers and employers. (In the true economic sense, the full incidence of the tax falls on the worker even though half the tax is nominally paid by the employer; the worker pays that half in lower wages.) The tax is a regressive one, levied on earnings only up to a maximum. Thus, it transfers income not merely from young to old but from poor to rich(er). Total benefits have virtually no connection with amounts paid in. The first Social Security recipient began contributing in 1937 and receiving in 1939, and lived to age 100 in 1974; she paid in $24.75 and received $22,888.92. Today, most new recipients will not receive even what they pay in, let alone a reasonable rate of return on their “investment.”
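The regressivity point is easy to see in a sketch: a flat payroll tax that stops at an earnings cap yields an effective rate that falls as income rises past the cap. The 12.4% combined rate and the round-number cap below are assumptions for illustration, not a statement of current law.

```python
# Why a flat payroll tax with an earnings cap is regressive:
# the effective rate declines once earnings exceed the cap.

RATE = 0.124          # assumed combined employer + employee share
CAP = 160_000         # assumed taxable-earnings ceiling (round number)

def payroll_tax(earnings):
    """Tax owed: flat rate on earnings up to the cap, nothing above it."""
    return RATE * min(earnings, CAP)

for earnings in (40_000, 160_000, 640_000):
    tax = payroll_tax(earnings)
    print(f"earnings {earnings:>7}: tax {tax:>9.2f}, "
          f"effective rate {tax / earnings:.1%}")
```

A worker earning four times the cap pays the same dollar tax as one earning exactly the cap, so the effective rate on the higher earner is a quarter of the statutory rate.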
For years, Social Security was held up as the crown jewel of the welfare state. The baby boom created huge numbers of payers relative to recipients, masking the inherent unsoundness of the system. Today, the reverse is true. Recent falling birth rates throughout the Western world have combined with increasing life expectancies at age 65 to produce an actuarial nightmare – the unfunded liabilities of social insurance systems are off the charts, dwarfing even those of sovereign debt. Politically powerful senior citizens’ groups like AARP agitate against reforms to the system and demand the return of “their” money, oblivious to the facts that it is long gone and that the trust funds are a mere accounting fiction.
When private citizens save for their own retirement, their savings are invested in productive assets. Of course, not every investment is successful, but on net balance, American business is productive. After all, the increase in productivity from year to year is what enables more goods and services to be produced and consumed; that is what has constituted America’s rising standard of living. When government takes our Social Security contributions in lieu of allowing us to save, that money is spent to pay current benefit recipients and the productive investment that would otherwise occur is never made.
With the passage of Social Security, America had crossed an invisible divide. Subsidy Nation had hit the big time.
Welfare – But Whose?
In the mid-1940s, the federal government began subsidizing the lunches of public schoolchildren. The program was originally designed to do two things: help get rid of surplus foodstuffs piled up by federal farm subsidies and improve the nutrition of poor schoolchildren. Over the years, the program grew in size and scope. It expanded to include breakfasts as well as lunches – of course, this required children to arrive at school earlier. Recently, dinners have shown up on the menu. And what started out as a school-lunch program for the poor has now become an all-purpose program of nourishment for public schoolchildren.
As a means of improving welfare, it suffers all the defects of government programs. The program has expanded pari passu with an increase in childhood obesity and incipient diabetes. We now know that government-imposed dietary standards are responsible for some large measure of this; today’s new nutritional learning has stood yesterday’s virtually on its head. The subsidized prices students pay (or not) distort the choices made; encouraging food consumption via artificially low prices is not the way to deal with obesity. The distortion affects the supply side of the market, too, by reducing the incentive to craft desirable menus; yesterday’s wasted rotting food surpluses are today’s uneaten food shoveled down the drain or into trash bins.
Emboldened by its rip-roaring success with school children, the federal government broadened its food-subsidy programs to include poor people generally in the early 1960s. It began issuing stamps for use as vouchers in purchasing food, with each stamp good for a value of food purchases. Again, the program began with limited purposes – to ensure a minimum amount of nutrition for citizens with incomes below the poverty line. Again, the program grew like Topsy. Today, some 47.7 million people – nearly 1 out of every 7 Americans – receive food stamps.
Was this stupendous growth owing to the brilliant success of the program? Hardly. The food-stamp program is such a notorious example of government subsidy gone wrong that it is featured in economics textbooks as a case study in what not to do when trying to help people. Over the years, food stamps have usually traded at a discount for cash in the black market – sometimes at rates of 50 cents per dollar of nominal value. The words “waste, fraud and abuse” should appear in dictionaries alongside the entry for “food stamps.”
Both school lunch and food-stamp programs run afoul of the general economic principle that subsidies in cash are generally preferred to subsidies in kind. People can use the cash in any way they prefer but are limited to particular uses for school lunch or food-stamp subsidies. This raises the highly pertinent question: Whose welfare are welfare programs supposed to increase – that of recipients or taxpayers? Given the tremendous waste inherent in both programs, that may seem a dumb question. Yet taxpayers display stubborn resistance to reform of these programs, typically based on the presumption that cash subsidies would be wrongly used by the poor and needy – who, unlike taxpayers, are presumably too stupid to be able to judge their own best interests.
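The cash-versus-in-kind argument can be put in rough numbers. If in-kind benefits resell on the black market at a discount (the 50-cents-per-dollar rate cited above), that discount bounds how little the marginal recipient values them relative to cash. The administrative-overhead figure below is an invented assumption for the sketch.

```python
# Back-of-the-envelope cost of delivering value via in-kind subsidies.
# If a recipient would resell a benefit at a fraction of face value,
# that fraction approximates the value of the marginal in-kind dollar.

def taxpayer_cost_per_dollar_of_value(resale_rate, admin_overhead=0.10):
    """Dollars taxpayers spend per dollar of value the recipient receives,
    assuming the recipient values marginal benefits at the resale rate
    and that administration adds the given overhead fraction."""
    return (1.0 + admin_overhead) / resale_rate

for rate in (1.0, 0.75, 0.50):
    cost = taxpayer_cost_per_dollar_of_value(rate)
    print(f"recipient values $1 in kind at ${rate:.2f}: "
          f"taxpayers spend ${cost:.2f} per $1 of value delivered")
```

At a 50-cent resale rate, taxpayers spend over two dollars for every dollar of value the recipient actually obtains; a cash transfer, by contrast, delivers a dollar of value per dollar spent (plus overhead).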
At a moment in history when Western governments are staggering under the burden of overwhelming debt, it seems germane to point out that cash subsidies sufficient to lift every man, woman and child in the U.S. above the poverty line would cost much less than is currently spent on “welfare” programs of all types. How much less? Over the decades, back-of-the-envelope estimates have ranged anywhere from half to one-tenth of actual welfare spending.
In other words, these subsidies are staggeringly inefficient.
Medicare and Medicaid
In the 1960s, the architects of President Lyndon Johnson’s Great Society observed that there was evidently political capital in creating benefits for the elderly. Even more importantly, the fact that these benefits were financed by taxes and cost more than the value they created did not seem to be generally recognized. Nor did it affect their political popularity. Johnson’s advisors rubbed their hands and set to work creating a system of government medical care for the elderly and the indigent.
On the surface, Medicare may seem different from government-owned and operated systems like Great Britain’s National Health Service. The difference derives almost completely from the fact that Medicare is administered by large private contractors, such as Blue Cross and Blue Shield and hospitals. That does make at least one important difference on the supply side of the market – the presence of profit throughout the system means that there is an incentive to make and maintain capital investments in medical technology and pharmaceutical products. In Great Britain, by contrast, the lack of state-of-the-art equipment is a scandal.
But as the consumer experiences them, there is little to choose between the U.S. system and government systems. The overriding similarity is the rule of the third-party payment, in which the consumer chooses treatments but the government/insurance company pays the bills. Thus, the consumer has almost no incentive to economize or choose wisely and resources are wasted hand over fist. Walk into a U.S. emergency room – the context in which high prices would and should place the highest premium on careful choices by consumers – and chances are that hospital staff will refuse to quote a price for any particular service, at most providing a flat rate for provision of service. The consent to treatment form that the patient is obligated to sign is either a blank check written on the insurance company or (for the uninsured) a farewell note to his or her net worth.
Even worse is the effect on the demand for medical care. When somebody else is paying the bill, consumers react as if the price of medical services were zero. Demand zooms into the stratosphere. Proponents of government-run health care pretend that medical care is an absolute necessity, but only a tiny fraction of it pertains to immediate threats to life. The need for a working price system in health care is urgent.
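The third-party-payment effect can be sketched with a toy demand curve in which consumption responds to the out-of-pocket price, i.e., the sticker price times the consumer’s coinsurance share. All parameters below are illustrative assumptions.

```python
# Toy demand curve for medical services: quantity demanded depends on
# the out-of-pocket price, not the sticker price. When a third party
# pays the whole bill (coinsurance share 0), demand behaves as if the
# price were zero.

def quantity_demanded(price, coinsurance, a=100.0, b=0.8):
    """Linear demand in the out-of-pocket price: Q = a - b*(price*coinsurance)."""
    out_of_pocket = price * coinsurance
    return max(a - b * out_of_pocket, 0.0)

STICKER = 100.0
for share in (1.0, 0.2, 0.0):   # uninsured, 20% coinsurance, fully covered
    q = quantity_demanded(STICKER, share)
    print(f"coinsurance {share:.0%}: quantity demanded {q:.0f}")
```

Moving from full out-of-pocket payment to full coverage quintuples consumption in this sketch; the exact multiple is an artifact of the assumed parameters, but the direction is the essay’s point.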
All this is bad enough, but Medicare and Medicaid compound their basic felonies with Byzantine regulations that add complication in the name of saving money without reducing true economic costs. Economic cost is the value of alternatives in production and consumption, as reflected in market prices. Since Medicare and Medicaid suppress market prices, their supposed “cost savings” are bogus. Both bureaucrats and government contractors lack the information necessary to centrally plan the medical care of millions of individuals. Arbitrarily reducing the fees of doctors is not cost savings; it merely reduces the level of care and substitutes poor service for higher prices.
Subsidy Nation had achieved another milestone: entrapping the elderly in an inferior system of subsidized medical care with no escape route.
Environmentalism, Alternative Energy and Streetcars
The publication of Rachel Carson’s Silent Spring in 1962 marked the watershed when the movement for conservation of natural resources – a goal with some basis in logic – changed course into a secular religion called environmentalism. The latter term has no logical underpinnings since it offers no grounds for favoring one aspect of “the environment” over another. Clean air is a good thing, right enough – but how clean does it need to be? And whose idea of “clean” gets imposed on everybody else? Ditto for water, land and the rest of “the environment.” The very essence of free markets is to provide an efficient answer to questions like that, while in Subsidy Nation those questions not only go unanswered, but even unasked.
Insofar as environmentalism has anything one could call “principles,” it believes that there is some sort of absolute standard for damage to the holistic, personified entity called Mother Nature, and that the resources of nature cannot be exploited without violating that standard. This provides the implicit justification for subsidizing technologies like solar and wind and fuels like ethanol. Without subsidies, these would vanish from sight due to their unproductiveness. Indeed, the subsidies must be given on both sides of the market – business-firm subsidies to stimulate production and consumer subsidies to stimulate consumption. This is the sine qua non of Subsidy Nation: government at every level has to play Dr. Frankenstein by artificially animating the entire market.
Environmental hysteria reaches its apogee with the recent fad for streetcars. In an age when science has given us the power to ride in computer-guided, driverless cars, trucks and planes – thus vastly enhancing safety, increasing speed and boosting human productivity – we are going to ride around in streetcars? Voluntarily? These are so monumentally inefficient and ineffectual that they require massive subsidies funded by taxes enacted by stealth. This is Subsidy Nation run amok.
Over the last quarter century, casual observers of state and local business blinked in amazement at the rise to prominence of “economic development.” Every state in the nation established a department, bureau or corporation of economic development. “Economic development incentives” became the order of the day. After a couple of centuries of folding, spindling and mutilating economic principles and logic, local politicos were at last getting the message. Surely prosperity would follow closely in the wake of this phenomenon.
Seasoned observers knew better. They knew that the discipline of economics contained a field of specialization called “economic development,” pioneered by one of the most famous economists of all time, Joseph Schumpeter. They had remarked on the curiosity that – like the dog that didn’t bark in the nighttime – state departments of economic development seldom employed actual economists and almost never spoke a word of genuine economic development theory.
The reality of economic development after World War II in the undeveloped Third World – Africa, most of Latin America and much of Asia – foundered on the lack of effective markets. A major roadblock was the stifling effect of crony capitalism – the preemption of investment by friends, relatives and associates of those in power. This is the kind of “economic development” being preached and practiced at the state and local level today. It may have been best described by the wags in Arkansas who remarked drily that, while Governor, Bill Clinton was bent on achieving economic development one business at a time – starting with his friends.
EDIs take various forms, but they all involve artificial encouragement of business and particularly of investment. The operative meaning of “artificial” is defined in two ways: by selectivity and by distortion. Selectivity implies the granting of favors and making of distinctions for one or a few, but not for any or all. The process known as “tax increment financing” is a classic example. It grants recipient businesses favored tax treatment on their business investment, and it is awarded by a commission rather than being available to all on equal terms. Naturally, the commission is set up precisely to award TIF status to those who enjoy the favor of and/or play ball with the local political establishment. Also naturally, the pretense is made that TIF follows sound, established economic principles. Accordingly, awards and publicity are well larded with jargon terms and blue-sky prospectuses. Distortion involves the waiving or modifying of normal outcomes and procedures, such as market prices, taxes and government rules and processes.
It takes a professional to see all the way through the economic development charade. For example, many well-meaning amateurs comment approvingly on the “competition between state and local governments for business” that gives rise to EDIs, and compare it with states that lower corporate and individual income tax rates to attract business and residential inhabitants. In the first place, across-the-board tax reductions do not distort the relative merits of individual investments the way that EDIs do. Secondly, taxes discourage economic activity by distorting prospective returns and incentives – hence reducing taxes across the board eliminates a distortion. EDIs create distortions: by making one investment appear misleadingly attractive, they make every other investment appear misleadingly less attractive.
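The relative-distortion point can be illustrated with made-up numbers: a selective tax break can make a less productive project outrank a more productive one after tax, so capital flows to the wrong place. The returns and rates below are invented for the sketch, not drawn from any actual TIF award.

```python
# How a selective tax break (e.g., a hypothetical TIF-style award)
# reorders investment choices. Project A creates more value before
# taxes, but a break granted only to Project B makes B win after tax.

def after_tax_return(pretax_return, tax_rate):
    """Net return once the applicable tax rate is applied."""
    return pretax_return * (1.0 - tax_rate)

GENERAL_RATE = 0.30     # assumed rate everyone pays
FAVORED_RATE = 0.05     # assumed rate for the favored recipient

project_a = 0.10        # 10% pre-tax return, taxed at the general rate
project_b = 0.08        # 8% pre-tax return, granted the selective break

a_net = after_tax_return(project_a, GENERAL_RATE)   # about 0.070
b_net = after_tax_return(project_b, FAVORED_RATE)   # about 0.076

# Capital flows to B even though A creates more value before taxes.
print(f"A after tax: {a_net:.3f}, B after tax: {b_net:.3f}")
print("favored project wins:", b_net > a_net)
```

An across-the-board rate cut, by contrast, scales both projects’ returns by the same factor and leaves their ranking intact, which is the distinction the paragraph above draws.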
But the professionals aren’t fooled. Organizations as diverse as the Minneapolis Federal Reserve and the United Nations have commented unfavorably upon EDIs. Economists across the political spectrum have condemned them. The problem is that economists capture public attention only when the debate focuses on unemployment, inflation, “creating jobs” or predicting interest rates. Since these are things economists don’t do well, the well is poisoned for discussion of genuine economics.
Going Down for the Third Time
Today, the U.S. is drowning in a sea of subsidies. Most Americans have grown up being subsidized by the federal government. They have watched their friends, neighbors and enemies being subsidized with their tax dollars. They have come to view the political process exclusively as a zero-sum game, a fight in which the majority gets to use its absolute power to take the minority’s money for its own use. “Rights” simply define the ways and means of effecting the seizure. Economic growth, if it is pondered at all, is viewed not in the aggregate but rather as their own, personal, cost-of-living increase, decreed from above by somebody in authority. Apparently, the source of all that bounty that comprises the American way of life is a mystery to them.
The truth is grim. Throughout the 20th century, real income rose in the U.S. But this aggregate outcome concealed an underlying struggle between two forces pulling real income in opposite directions. Technology was advancing, increasing productivity and driving real income up. Meanwhile, each new round of subsidies was reducing economic efficiency, driving real income down. The net result, fairly steady increases in real income, reflected the fact that science, technology and a brain-inflow from the rest of the world were enough to pay the subsidy bills with a little left over to grow on. But that long holiday is now over. The end of the baby boom and the death knell of Reaganomics sent economic growth on a downhill slide.
And we aren’t alone. A good part of the world preceded us, even outdid us, in the transition to Subsidy Nation. Their downfall is the preview of our coming detractions. They are already underwater. We are going down for the third time. Like them, we are going not with a bang, but with a whimper. Crying for political compromise. Moaning for our entitlements. Whining for our subsidies.