Born into a world without indoor plumbing, airplanes, refrigeration or even polyester (the poor bastards), our great-grandparents witnessed quite a technological leap during their lives, beginning in an era when automobiles had not yet come into widespread use.
By the time they started collecting Social Security (another innovation), many of our great-grandparents’ homes had been air conditioned. Polio had been all but eradicated, and their televisions, colorized. It was a new world of skyscrapers, Apollo spacecraft and Concorde jets. The year 2000, surely, would bring high-speed, hands-free hover cars. A week at an undersea resort would make for a modest vacation.
The unprecedented pace of modernization through the first three-fourths of the 20th century kept that type of projection within the scope of (eccentrically optimistic) reason. Of course, reality has not been as kind. With the exception of financial innovations and information technology, society’s rate of technological advance since the mid-1970s has slowed to a crawl.
With NASA in slow descent into irrelevance, the Space Age now rings like something out of a science fiction novel. The speed of physical travel peaked with the now-decommissioned Concorde, and in spite of billions in funding, we can’t seem to find an efficient alternative to the fossil fuels on which we’ve depended for decades.
Those who keep up with my work know that I’m rather fond of the Great Stagnation hypothesis put forth in recent years by Tyler Cowen and Peter Thiel. To sum it up very briefly, the two contend that we have plateaued technologically outside of a few select sectors, because we have run out of the “low hanging fruit” that spawned previous innovations, coupled with badly written, often excessive regulation.
One of the points Cowen touches upon in The Great Stagnation is the diminishing marginal utility of public and private sector spending. I’m going to focus on this matter specifically, because I believe it to be a critically underrated factor behind our present economic state. Not taking its effect into account deludes us into believing we are richer and more technologically advanced than we actually are. For businesses and governments alike, these blinders can have devastating consequences. Lastly, the concept of declining marginal utility in our institutions is endemic to neither the left nor the right, nor hostage to any particular ideology.
Government spending is simply not nearly as valuable as it was decades ago. Our homes have been electrified, our rivers spanned by bridges and our cities connected by interstate highways.
Indeed, it’s easier to teach someone arithmetic than it is to teach them calculus. Doubling our miles of interstate highways or extending a comprehensive passenger rail network outside of the I-95 corridor is impractical. Whether it’s educational achievement, infrastructural improvements or the space program, dollars simply don’t go as far as they once did.
As if that weren’t bad enough, the government has embarked on new and doubtlessly more questionable methods of spending. As Cowen notes, the post-9/11 build-up in military and domestic security is meant to neutralize terrorist threats, but there isn’t a particularly good way of measuring its efficacy. Not to mention that it doesn’t produce anything that raises our general standard of living.
Suppose you live in a city of 100,000 people and $100 million is allocated from the city coffers to build a highway through town. It’s possible everything works out and the highway is worth exactly $1,000 to you — its per-capita cost. Maybe you would pay up to $3,000 to use it for x years, in which case you’re getting a bargain. But if you prefer to bike or never leave town, it’s not worth much at all. Because the public sector values its creations at cost, instead of price, we really have no idea whether the government should spend more or less on y or z.
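The arithmetic behind the highway example can be made concrete. The sketch below uses hypothetical resident valuations (the commuter, occasional driver and cyclist are my own illustrative figures, not data) to show why the “book value” of a public project — cost divided by population — tells us nothing about what the project is actually worth to people:

```python
# Illustrative sketch of the cost-vs-price problem.
# All valuations below are hypothetical, invented for the example.

budget = 100_000_000   # $100M allocated for the highway
population = 100_000   # city residents

# The government's implicit "value" of the project: its cost per resident.
per_capita_cost = budget / population  # $1,000

# What three hypothetical residents would actually pay to use it:
valuations = {
    "daily_commuter": 3_000,    # gets a bargain relative to the $1,000 cost
    "occasional_driver": 1_000, # breaks exactly even
    "cyclist": 0,               # never drives; the road is worth nothing
}

for resident, value in valuations.items():
    surplus = value - per_capita_cost
    print(f"{resident}: values road at ${value:,}, surplus ${surplus:,.0f}")
```

Cost-based accounting assigns all three residents the same $1,000 figure, even though their actual willingness to pay ranges from $0 to $3,000 — which is the point: without a price signal, we can’t tell whether the spending created or destroyed value.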
The private sector is hardly immune to decreasing marginal utility. Take the billions pharmaceutical companies continue to pour into research and development without much to show in life expectancy gains as a prime example. But because the private sector values goods and services at price, rather than cost, we can figure out whether to spend more or less on widgets.
When you buy an apple priced at $1, it’s worth at least $1 to you at that time and place. Otherwise, you likely wouldn’t buy it. The same rule holds true for automobiles, computers and Snuggies. Of course, even the freest of markets will not be Pareto efficient, and monopolies in certain sectors may arise, but for the time being markets are the best we’ve got. They still provide better information to more people about public or private goods than any government can.
On the innovation front, perhaps the best thing we can say about post-1980 Western governments is that their costly wars on gas-guzzlers, obesity and smoking may grant us a few extra years of average life expectancy at the end of our retirement. But is government spawning much innovation beyond encouraging spending on diets, protein bars and nicotine patches? Is government incentivizing any economic production among those retirees, or creating any tangible wealth for that matter? It certainly doesn’t take many more people to manufacture a hybrid, cook healthy or roll fewer cigarettes.
Because dollars haven’t gone as far in recent decades as they once did, we require more dollars than ever before to achieve similar outcomes — in this case, a rate of innovation comparable to that of our great-grandparents’.
The problem is that having more dollars does not necessarily create an equal interest in pushing technological advance across all sectors of the economy. This is generally associated with the private sector. As a result, the effect of diminishing marginal utility in both the public and private spheres has all but terminated the existence of the working- or middle-class entrepreneur.
Prior to the turn south in marginal utility, a working or middle-class individual with decent credit could obtain a loan, and spend their spare time working on gadgets and gizmos to better some aspect of their life. More significantly, for a long time, increases in government spending meant increases in standard of living. But that age is long gone.
With the exception of Internet-based entrepreneurship — a culturally powerful but, thus far, economically underwhelming domain — the prospect of innovation dwells almost exclusively in the hands of the very well-off.
It’s not that I believe the rich man or woman to be inherently evil, but rather, no less self-interested than any other person. The disappearance of the middle class from innovation and the effect of declining marginal utility, particularly in the public sector, only amplify the nature of their actions. Vast amounts of money must be allocated to get most public and private projects off the ground these days. As long as the trend of diminishing marginal utility and the rising price of innovation continue, the rich will be much more selective in the projects they choose to invest in. They’ll take fewer risks, especially in broad, open-ended projects where the guarantee of their own self-interest is uncertain.
That’s partially why we haven’t seen a complete stop in innovation to date. The breakthroughs in the financial sector from the 1970s to the Great Recession were incredible. The development and trade of complex financial instruments like derivatives and asset-backed securities prospered for a long time because it worked — it created wealth, if only for a tiny minority of Americans.
Of course, innovation for the very rich and for the rest of society is not mutually exclusive. The one other industry where we find ourselves leaps and bounds ahead of where we were in 1970 is information technology, specifically telecom and the Internet. Almost all Americans have benefited from these advances. The improvements made in communication and in the sharing of ideas and creations indicate that the future of the information sector is very promising — if we can keep the Internet relatively free from the middling hands of bureaucrats. Nonetheless, the years ahead look grim if we’re relying solely on the Internet for wealth creation, as I wrote about in an Oct. 4, 2011 piece.
As for government, it can take even more risks than most of the wealthy. But if the consistently inept central body has nothing to show for spending trillions in taxpayer dollars, and the bureaucracy continues to grow — if only to protect its own relevance — how is that not a form of plunder?
Unfortunately, this tight-fistedness toward risky public goods neglects many important areas of American society. Let’s look at food. When you’re wealthy, you can afford to purchase high-quality ingredients, hire a cook or eat out when you please. That’s not to say food trucks and “fresh and fast” outlets like Chipotle have not been welcome arrivals, but our country’s obesity problem highlights a general lack of interest or impetus toward modifying our diet for the better.
The wealthy can afford to send their children to private elementary and secondary schools. Their connections and legacy status at America’s elite universities often allow their children admission as well. And when they get sick, the rich can count on affording treatment at one of America’s world-class hospitals.
These stand among the primary reasons why the appearances of most modern-day kitchens, public schools and hospitals are not too far removed from their 1972 counterparts.
I don’t think I’m being overly simplistic here. As long as the increasingly homogeneous class of creators has no need to improve its own access to good food, education or health care, what incentive does it have to ameliorate conditions for masses of people it has little to nothing in common with?
In an Apr. 23 piece in Slate, Timothy Noah writes, “in 1979 the top 1 percent consumed about 10 percent of the nation’s collective income. In 2010 it consumed about 20 percent.”
The reason that statistic is worrisome is not an anti-rich bias, or some moral desire of mine to see progressive taxation or an “equal” income distribution. It’s worrisome because as long as a small minority attains a larger share of society’s wealth while the marginal utility of innovation spending decreases, its foothold over the entrepreneurial class it represents grows stronger. With that control comes a direct interest in what to, and what not to, invest in. As long as spending and societal advance are directly affected by diminishing marginal utility, the self-interested inertia of the very well-off will only pull the direction of innovation toward what the wealthy want to see. When a small group of like-minded people gets to centrally decide what direction society will go in, the odds of rising from our present technological perch appear far from promising.
The result: the potential for a class divide that is insurmountable.