
Rutger Bregman: Utopia for Realists

The Return of Utopia

Welcome, in other words, to the Land of Plenty. To the good life, where almost everyone is rich, safe, and healthy. Where there’s only one thing we lack: a reason to get out of bed in the morning. Because, after all, you can’t really improve on paradise.

But the real crisis of our times, of my generation, is not that we don’t have it good, or even that we might be worse off later on. No, the real crisis is that we can’t come up with anything better.

Why have we been working harder and harder since the 1980s despite being richer than ever? Why are millions of people still living in poverty when we are more than rich enough to put an end to it once and for all? And why is more than 60 % of your income dependent on the country where you just happen to have been born?

Utopias offer no ready-made answers, let alone solutions. But they do ask the right questions.

Whether it’s the growth of the economy, audience shares, publications – slowly but surely, quality is being replaced by quantity.

And driving it all is a force sometimes called “liberalism”, an ideology that has been all but hollowed out.

Meanwhile, the welfare state has increasingly shifted its focus from the causes of our discontent to the symptoms.

The food industry supplies us with cheap garbage loaded with salt, sugar, and fat, putting us on the fast track to the doctor and dietitian. Advancing technologies are laying waste to ever more jobs, sending us back again to the job coach. And the ad industry encourages us to spend money we don’t have on junk we don’t need in order to impress people we can’t stand.

That’s the dystopia we are living in today. According to the World Health Organization, depression has even become the biggest health problem among teens and will be the number-one cause of illness worldwide by 2030.

It is capitalism that opened the gates to the Land of Plenty, but capitalism alone cannot sustain it. Progress has become synonymous with economic prosperity, but the twenty-first century will challenge us to find other ways of boosting our quality of life.

The word utopia means both “good place” and “no place”. What we need are alternative horizons that spark the imagination. And I do mean horizons in the plural; conflicting utopias are the lifeblood of democracy, after all.

Without utopia, we are lost. Not that the present is bad; on the contrary. But it is bleak if we have no hope of anything better.

Why We Should Give Free Money to Everyone

Studies from all over the world offer proof positive: Free money works.

Research has correlated unconditional cash disbursements with reductions in crime, child mortality, malnutrition, teenage pregnancy, and truancy, and with improvements in school performance, economic growth, and gender equality.

“Poverty is fundamentally about a lack of cash. It’s not about stupidity,” stresses the economist Joseph Hanlon. “You can’t pull yourself up by your bootstraps if you have no boots.”[1]

Free money: It’s a notion already proposed by some of history’s leading thinkers. Thomas More dreamed about it in his book Utopia in 1516.

And Article 25 of the Universal Declaration of Human Rights (1948) promises that, one day, it will come. A universal basic income. Basic income: It’s an idea whose time has come.

In the revolutionary year of 1968, five famous economists – John Kenneth Galbraith, Harold Watts, James Tobin, Paul Samuelson, and Robert Lampman – wrote an open letter to Congress. “The country will not have met its responsibility until everyone in the nation is assured an income no less than the officially recognized definition of poverty,” they said in an article published on the front page of the New York Times. The letter was signed by 1,200 fellow economists.

The following August, President Nixon presented a bill providing for a modest basic income, calling it “the most significant piece of social legislation in our nation’s history.” After months of being batted back and forth between the Senate and the White House, the bill was finally canned.

Not so very long ago, democracy still seemed a glorious utopia. Many a great mind, from the philosopher Plato (427 – 347 B.C.) to the statesman Edmund Burke (1729 – 97), warned that democracy was futile.

Compare this with the arguments against basic income. It’s supposedly futile because we can’t pay for it, dangerous because people would quit working, and perverse because ultimately a minority would end up having to toil harder to support the majority.

Futile? For the first time in history, we are actually rich enough to finance a sizable basic income.

Dangerous? Certainly, some people may opt to work less, but then that’s precisely the point.

One of the perks of a basic income is that it would free the poor from the welfare trap and spur them to seek a paid job with true opportunities for growth and advancement. Since basic income is unconditional, and will not be taken away or reduced in the event of gainful employment, their circumstances can only improve. Perverse? On the contrary, it is the welfare system that has devolved into a perverse behemoth of control and humiliation.

We’re saddled with a welfare state from a bygone era when the breadwinners were still mostly men and people spent their whole lives working at the same company.

In recent decades the middle class has retained its spending power by borrowing itself into ever-deeper debt. But this model isn’t viable, as we now know.

We, the inhabitants of the Land of Plenty, are rich thanks to the institutions, the knowledge, and the social capital amassed for us by our forebears. This wealth belongs to us all. And a basic income allows all of us to share it.

Utopias always start out small, with experiments that ever so slowly change the world.

The End of Poverty

A world without poverty – it might be the oldest utopia around. But anybody who takes this dream seriously must inevitably face a few tough questions.

The poor borrow more, save less, smoke more, exercise less, drink more, and eat less healthfully.

British Prime Minister Margaret Thatcher once called poverty a “personality defect”.

Eldar Shafir, a psychologist at Princeton University, and Sendhil Mullainathan, an economist at Harvard, recently published a revolutionary new theory of poverty. They want nothing less than to establish a whole new field of science: the science of scarcity. Scarcity impinges on your mind. People behave differently when they perceive a thing to be scarce.

People who experience a sense of scarcity are good at managing their short-term problems.

Scarcity narrows your focus to your immediate lack, to the meeting that’s starting in five minutes or the bills that need to be paid tomorrow. The long-term perspective goes out the window. “Scarcity consumes you,” Shafir explains. “You’re less able to focus on other things that are also important to you.”

There’s a key distinction though between people with busy lives and those living in poverty: You can’t take a break from poverty.

In the U.S., where more than one in five children grow up poor, countless studies have already shown that anti-poverty measures actually work as a cost-cutting instrument.

Shafir and Mullainathan have a few possible solutions up their sleeves: giving needy students a hand with all that financial-aid paperwork, for instance, or providing pill boxes that light up to remind people to take their meds. This type of solution is called a “nudge.” Nudges are hugely popular with politicians in our modern Land of Plenty, mostly because they cost next to nothing.

Up to a per capita GDP of roughly $ 5,000 a year, life expectancy increases more or less automatically. But once there’s enough food on the table, a roof that doesn’t leak, and clean running water to drink, economic growth is no longer a guarantor of welfare. From that point on, equality is a much more accurate predictor.

Because it’s all about relative poverty. However wealthy a country gets, inequality always rains on the parade. Being poor in a rich country is a whole different story from being poor a couple of centuries ago, when almost everybody, everywhere, was a pauper.

Society can’t function without some degree of inequality. There still need to be incentives to work, to endeavor, and to excel, and money is a very effective stimulus.

Those who cannot remember the past are condemned to repeat it. George Santayana (1863 – 1952)[2]

The Bizarre Tale of President Nixon and His Basic Income Bill

In the past, pretty much everything was worse. But with the world now changing faster than ever, the past seems more remote from us, too.

The past teaches us a simple but crucial lesson: Things could be different.

Richard Nixon was not the most likely candidate to pursue Thomas More’s old utopian dream, but then history sometimes has a strange sense of humor.

It would have been a massive step forward in the War on Poverty, guaranteeing a family of four $ 1,600 a year, equivalent to roughly $ 10,000 in 2016.

Martin Anderson was an advisor to the president and vehemently opposed to the plan. On the same day that Nixon intended to go public with his plan, Anderson handed him a briefing. Over the weeks that followed, this six-page document, a case report about something that had happened in England 150 years before, did the unthinkable: It completely changed Nixon’s mind, and, in the process, changed the course of history.

The report was titled “A Short History of a ‘Family Security System’” and consisted almost entirely of excerpts from sociologist Karl Polanyi’s classic book The Great Transformation (1944). In the seventh chapter, Polanyi describes one of the world’s first welfare systems, known as the Speenhamland system, in early nineteenth-century England. This system bore a suspiciously close resemblance to a basic income. Polanyi’s judgment of the system was devastating.

Two of Nixon’s leading advisors, the sociologist and later Senator Daniel Moynihan and the economist Milton Friedman, argued that the right to an income already existed.

According to Friedman, poverty simply meant you were strapped for cash. Nothing more, nothing less.

During the reign of Queen Elizabeth I (1558 – 1603), the Poor Law had introduced two forms of assistance – one for the deserving poor (the elderly, children, and disabled) and another for those who had to be forced to work.

Speenhamland became the textbook example of a government program that had, with the best of intentions, paved the road to hell. More recent research, however, has revealed that the system was actually a success. Malthus was wrong about the population explosion: the growth in births was driven chiefly by the rising demand for child labor, not by the allowances. At the time, children were like walking piggy banks, their earnings a kind of pension plan for their parents. In 1834, the Speenhamland system was nonetheless permanently dismantled.

The new Poor Law introduced perhaps the most heinous form of “public assistance” that the world has ever witnessed. Believing workhouses to be the only effective remedy against sloth and depravity, the Royal Commission forced the poor into senseless slave labor, from breaking stones to walking on treadmills.

Meanwhile, the myth of Speenhamland played a pivotal role in propagating the idea of a free, self-regulating market.

In 1996 the Democratic president Bill Clinton finally pulled the plug on “the welfare state as we know it.” For the first time since the passage of the Social Security Act in 1935, assistance for the poor was again seen as a favor instead of a right.

At Princeton University, the historian Brian Steensland has meticulously traced the rise and fall of basic income in the U.S., and he emphasizes that, had Nixon’s plan gone ahead, the ramifications would have been huge.

Capitalist or communist, it all boils down to a pointless distinction between two types of poor, and to a major misconception that we almost managed to dispel some forty years ago – the fallacy that a life without poverty is a privilege you have to work for, rather than a right we all deserve.

New Figures for a New Era

In 1850, the philosopher Frédéric Bastiat penned an essay titled “Ce qu’on voit et ce qu’on ne voit pas,” which means roughly “What you see and what you don’t.”

That is precisely what modern society’s sacred measure of progress, the Gross Domestic Product, does not measure. Ce qu’on ne voit pas.

The GDP is the sum of all goods and services that a country produces, corrected for seasonal fluctuations, inflation, and perhaps purchasing power. Besides being blind to lots of good things, the GDP also benefits from all manner of human suffering. If you were the GDP, your ideal citizen would be a compulsive gambler with cancer who’s going through a drawn-out divorce that he copes with by popping fistfuls of Prozac and going berserk on Black Friday. The GDP is equally indifferent to inequality, which is on the rise in most developed countries, and to debts, which make living on credit a tempting option.

As the Nobel laureate James Tobin said back in 1984, “We are throwing more and more of our resources, including the cream of our youth, into financial activities remote from the production of goods and services, into activities that generate high private rewards disproportionate to their social productivity.”[3]

“There is only one class in the community that thinks more about money than the rich,” said Oscar Wilde, “and that is the poor.”

The idea that the GDP still serves as an accurate gauge of social welfare is one of the most widespread myths of our times.

In 1665, the Englishman William Petty was the first to present an estimate of what he termed the “national income.” His purpose was to discover how much England could raise in tax revenues, and, by extension, how long it could continue to finance war with Holland.

If you had asked Hoover how “the economy” was doing, he would have given you a puzzled look.

“Economy” isn’t really a thing, after all – it’s an idea, and that idea had yet to be invented.

In 1931, Congress called together the country’s leading statisticians and found them unable to answer even the most basic questions about the state of the nation. A few months earlier, President Hoover had dispatched a number of Commerce Department employees around the country to report on the situation.

Congress wasn’t reassured, however. In 1932, it appointed a brilliant young Russian professor by the name of Simon Kuznets to answer a simple question: How much stuff can we make? Over the next few years, Kuznets laid the foundations of what would later become the GDP.

All across the world, economists began to play a dominant role in politics. Most were educated in the United States, the cradle of the GDP, where practitioners pursued a new, scientific brand of economics revolving around models, equations, and numbers.

When people around 1900 talked about “the economy,” they usually just meant “society.”

“The first thing you do in the 1950s and ’60s if you’re a new nation is you open a national airline, you create a national army, and you start measuring GDP.”

Initially, the more common measure was the gross national product (GNP), but in the 1990s this was superseded by the GDP. The GNP adds up all a country’s economic activity (including activities abroad), while the GDP adds up all activities within its borders (including by foreign enterprises).
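To make the accounting distinction concrete, here is a minimal sketch with purely hypothetical figures (the variable names and numbers are illustrative, not from the book): GDP counts everything produced inside a country’s borders regardless of who owns the producer, while GNP counts everything produced by the country’s own residents and firms, wherever they operate.

```python
# Toy illustration of the GDP/GNP distinction (all figures hypothetical).
domestic_output_by_residents = 900   # produced at home by domestic firms and workers
domestic_output_by_foreigners = 150  # produced at home by foreign-owned enterprises
output_abroad_by_residents = 100     # produced abroad by the country's own firms and citizens

# GDP: everything produced within the borders, whoever produced it.
gdp = domestic_output_by_residents + domestic_output_by_foreigners

# GNP: everything produced by the country's residents, wherever it was produced.
gnp = domestic_output_by_residents + output_abroad_by_residents

print(f"GDP = {gdp}, GNP = {gnp}")  # GDP = 1050, GNP = 1000
```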

“The gross national product … measures everything … except that which makes life worthwhile,” said Robert Kennedy.[4]

We need a good dose of irritation, frustration, and discontent to propel us forward. If the Land of Plenty is a place where everybody is happy, then it’s also a place steeped in apathy.

“Discontent,” said Oscar Wilde, “is the first step in the progress of a man or a nation.”

Two candidates to replace the GDP are the Genuine Progress Indicator (GPI) and the Index of Sustainable Economic Welfare (ISEW), which also incorporate pollution, crime, inequality, and volunteer work in their equations.

Some things in life, like music, resist all attempts at greater efficiency.

“We can afford to pay more for the services we need – chiefly healthcare and education,” writes the economist William Baumol. “What we may not be able to afford are the consequences of falling costs.”

As the writer Kevin Kelly says, “Productivity is for robots. Humans excel at wasting time, experimenting, playing, creating, and exploring.” Governing by numbers is the last resort of a country that no longer knows what it wants, a country with no vision of utopia.

To be able to fill leisure intelligently is the last product of civilization. Bertrand Russell[5]

A Fifteen-Hour Workweek

Had you asked the greatest economist of the twentieth century, John Maynard Keynes, what the biggest challenge of the twenty-first would be, he wouldn’t have had to think twice. Leisure.

Keynes was neither the first nor the last to foresee a future awash in leisure. A century and a half earlier, American Founding Father Benjamin Franklin had already predicted that four hours of work a day would eventually suffice.

Yet the Industrial Revolution, which propelled the nineteenth century’s explosive economic growth, had brought about the exact opposite of leisure.

Henry Ford had discovered that a shorter workweek actually increased productivity among his employees. Leisure time, he observed, was a “cold business fact.”

After World War II, leisure time continued its steady rise. In 1956, Vice President Richard Nixon promised Americans that they would only have to work four days a week “in the not-too-distant future.” The country had reached a “plateau of prosperity,” and he believed a shorter workweek was inevitable.

In the 1980s, workweek reductions came to a grinding halt. Economic growth was translating not into more leisure, but into more stuff. In countries like Australia, Austria, Norway, Spain, and England, the workweek stopped shrinking altogether.

Asimov may have been right that by 2014 “work” would be the most glorified word in our vocabulary, but for a completely different reason. We aren’t bored to death; we’re working ourselves to death.

There are strong indications that in a modern knowledge economy, even forty hours a week is too much. Research suggests that someone who is constantly drawing on their creative abilities can, on average, be productive for no more than six hours a day.

What does working less actually solve?

  • Stress? Countless studies have shown that people who work less are more satisfied with their lives.
  • Climate change? A worldwide shift to a shorter workweek could cut the CO2 emitted this century by half.
  • Accidents? Overtime is deadly.
  • Unemployment? Obviously, you can’t simply chop a job up into smaller pieces. Nevertheless, researchers at the International Labour Organization have concluded that work sharing – in which two part-time employees share a workload traditionally assigned to one full-time worker – went a long way toward resolving the last crisis.
  • Emancipation of women? Countries with short workweeks consistently top gender-equality rankings.
  • Aging population? An increasing share of the older population wants to continue working even after hitting retirement age.
  • Inequality? The countries with the biggest disparities in wealth are precisely those with the longest workweeks.

In his classic book The Theory of the Leisure Class (1899), the sociologist Thorstein Veblen still described leisure as the badge of the elite. But things that used to be categorized as leisure (art, sports, science, care, philanthropy) are now classed as work.

We can handle the good life, if only we take the time.

Work is the refuge of people who have nothing better to do. Oscar Wilde (1854 – 1900)[6]

Why It Doesn’t Pay to Be a Banker

Take the slick Wall Street traders who line their pockets at the expense of another retirement fund. Take the shrewd lawyers who can draw a corporate lawsuit out until the end of days. Or take the brilliant ad writer who pens the slogan of the year and puts the competition right out of business. Instead of creating wealth, these jobs mostly just shift it around.

How is it possible that all those agents of prosperity – the teachers, the police officers, the nurses – are paid so poorly, while the unimportant, superfluous, and even destructive shifters do so well?

The U.S. financial sector is seven times as large as its agricultural sector.

“CLOSURE OF BANKS.” On May 4, 1970, this notice ran in the Irish Independent. Ireland’s bank employees decided to go on strike. In the end, the strike would last a whole six months.

In no time, people forged a radically decentralized monetary system with the country’s 11,000 pubs as its key nodes and basic trust as its underlying mechanism. By the time the banks finally reopened in November, the Irish had printed an incredible £ 5 billion in homemade currency.

According to historians, the reason the Irish were able to manage so well without banks was all down to social cohesion.

The bottom line is that wealth can be concentrated somewhere, but that doesn’t mean that’s where it’s being created.

If the post-war era gave us fabulous inventions like the washing machine, the refrigerator, the space shuttle, and the pill, lately it’s been slightly improved iterations of the same phone we bought a couple years ago.

In fact, it has become increasingly profitable not to innovate. Imagine just how much progress we’ve missed out on because thousands of bright minds have frittered away their time dreaming up hypercomplex financial products that are ultimately only destructive. Or spent the best years of their lives duplicating existing pharmaceuticals in a way that’s infinitesimally different enough to warrant a new patent application by a brainy lawyer so a brilliant PR department can launch a brand-new marketing campaign for the not-so-brand-new drug. Imagine that all this talent were to be invested not in shifting wealth around, but in creating it.

For every dollar a bank earns, an estimated equivalent of 60 cents is destroyed elsewhere in the economic chain. Conversely, for every dollar a researcher earns, a value of at least $ 5 – and often much more – is pumped back into the economy.

All the big debates in education are about format. About delivery. About didactics. Education is consistently presented as a means of adaptation – as a lubricant to help you glide more effortlessly through life.

The focus, invariably, is on competencies, not values. On didactics, not ideals. On “problem-solving ability,” but not which problems need solving.

Which knowledge and skills do today’s students need to get hired in tomorrow’s job market – the market of 2030? Which is precisely the wrong question.

Instead, we should be posing a different question altogether: Which knowledge and skills do we want our children to have in 2030?

The goal of the future is full unemployment, so we can play. Arthur C. Clarke (1917 – 2008)[7]

Race Against the Machine

Robots. They have become one of the strongest arguments in favor of a shorter workweek and a universal basic income. In fact, if current trends hold, there is really just one other alternative: structural unemployment and growing inequality.

In the same way that the transistor became the standard unit of information in the late 1950s, the shipping container became the standard unit of transport.

The advent of the chip and the box made the world shrink as goods, services, and capital circled the globe ever more rapidly. Technology and globalization advanced hand in hand and faster than ever. Then something happened – something that nobody had imagined possible.

Back in 1957 the economist Nicholas Kaldor outlined his six famous “facts” of economic growth. The first was: “The shares of national income that go toward labor and capital are constant over long periods of time.” In practice, that meant two-thirds of a country’s income goes into the paychecks of laborers and one-third into the pockets of the owners of capital – that is, the people who own the stock shares and the machines. Generations of young economists had it drilled into their heads that “the ratio of capital to labor is constant.” Period. But it’s not.

Today only 58 % of industrialized nations’ wealth goes to pay people’s salaries.
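As a rough back-of-the-envelope illustration of what that shift means, the sketch below compares Kaldor’s two-thirds benchmark with the 58 % figure cited above; the national-income total is an arbitrary placeholder, not a real statistic.

```python
# Labor share = total wages / national income; the remainder goes to capital.
national_income = 1_000  # hypothetical total, e.g. in billions

kaldor_labor_share = 2 / 3     # the "constant" drilled into generations of economists
current_labor_share = 0.58     # the figure cited for industrialized nations today

wages_then = national_income * kaldor_labor_share
wages_now = national_income * current_labor_share

print(f"Wages under Kaldor's constant: {wages_then:.0f}")
print(f"Wages at a 58% labor share:    {wages_now:.0f}")
print(f"Shifted from labor to capital: {wages_then - wages_now:.0f}")
# Roughly 87 of every 1,000 units of income has moved from paychecks to capital.
```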

Various factors are involved, including the decline of labor unions, the growth of the financial sector, lower taxes on capital, and the rise of the Asian giants. But the most important cause? Technological progress.

Innovations in Silicon Valley trigger mass layoffs elsewhere.

In the age of the chip, the box, and Internet retail, being just fractionally better than the rest means you’ve not only won the battle, you’ve won the war. Economists call this phenomenon the “winner-take-all society.”

The reality is that it takes fewer and fewer people to create a successful business, meaning that when a business succeeds, fewer and fewer people benefit.

If you look at the year 1800, some 74 % of all Americans were farmers, whereas by 1900 this figure was down to 31 %, and by 2000 to a mere 3 %.

The new generations of robots are proxies not only for our muscle power, but for our mental capacity, too.

It was at the beginning of the First Machine Age that textile workers in central and northern England rose up in rebellion, taking their name from the movement’s mythical leader Ned Ludd, who was supposed to have smashed two looms in a fit of rage in 1779. Because labor unions were outlawed, the Luddites opted for what the historian Eric Hobsbawm calls “negotiation by riot.”

The Luddite rebellion, at its height around 1811, was brutally crushed.

If people used to judge each other on their parentage, now it’s the diplomas on their wall. As long as machines can’t go to college, a degree offers higher returns than ever.

Just as we adapted to the First Machine Age through a revolution in education and welfare, so the Second Machine Age calls for drastic measures. Measures like a shorter workweek and universal basic income.

It is not technology itself that determines the course of history. In the end, it is we humans who decide how we want to shape our destiny.

Redistribution of money (basic income), of time (a shorter working week), of taxation (on capital instead of labor), and, of course, of robots.

Not long ago, the French economist Thomas Piketty had people up in arms with his contention that if we continue down our current path we’ll soon find ourselves back in the rentier society of the Gilded Age.

For hundreds of years the return on capital was 4 – 5 %, while annual economic growth lagged behind at under 2 %.
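A small compounding sketch shows why that seemingly modest gap matters so much over time; the 5 % and 2 % rates come from the figures above, while the starting values and the 100-year horizon are arbitrary choices for illustration.

```python
# If capital returns r = 5% a year while the economy grows at g = 2%,
# wealth steadily outpaces income, no matter where you start.
r, g = 0.05, 0.02
capital, income = 100.0, 100.0  # arbitrary starting values

for year in range(100):
    capital *= 1 + r
    income *= 1 + g

print(f"After 100 years: capital = {capital:.0f}, income = {income:.0f}")
print(f"Capital-to-income ratio grew from 1.0 to {capital / income:.1f}")
# With r = 5% and g = 2%, the ratio multiplies by roughly (1.05/1.02)**100 ≈ 18.
```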

All the standard options – more schooling, regulation, austerity – will be a drop in the bucket. In the end, the only solution is a worldwide, progressive tax on wealth, says Professor Piketty, though he acknowledges this is merely a “useful utopia.”

Beyond the Gates of the Land of Plenty

The Western world spends $ 134.8 billion a year, $ 11.2 billion a month, $ 4,274 a second on foreign development aid.

Over the past fifty years, that brings us to a grand total of almost $ 5 trillion.

According to a study done by the World Bank, 85 % of all Western aid in the twentieth century was used differently than intended.

A randomized controlled trial, or RCT, is a way to measure whether an intervention actually works: people are randomly assigned either to a group that receives it or to a control group that does not, and their outcomes are then compared. The first RCT of foreign development aid wasn’t conducted until 1998.
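For readers unfamiliar with the method, here is a minimal sketch of the logic behind an RCT, using entirely made-up data and a hypothetical `outcome` function: assign people at random to a treatment group and a control group, then compare average outcomes. Random assignment is what allows any systematic gap to be attributed to the intervention itself.

```python
import random
import statistics

random.seed(42)

# Hypothetical outcome (e.g., a test score); assume the intervention
# adds about 5 points on average to whoever receives it.
def outcome(received_aid: bool) -> float:
    base = random.gauss(50, 10)
    return base + (5 if received_aid else 0)

people = list(range(1000))
random.shuffle(people)                  # random assignment is the crucial step
treatment, control = people[:500], people[500:]

treated_scores = [outcome(True) for _ in treatment]
control_scores = [outcome(False) for _ in control]

effect = statistics.mean(treated_scores) - statistics.mean(control_scores)
print(f"Estimated treatment effect: {effect:.1f} points")  # close to the true effect of 5
```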

It was an American professor named Michael Kremer who had the insight to investigate the effects of free textbooks on Kenyan grade-school pupils.

Kremer’s was a landmark experiment. Since then, a veritable randomization industry has grown up around development aid, led by the aptly nicknamed “randomistas.” These are researchers who have had enough of the intuition, gut feelings, and ideological bickering of ivory-tower scholars about the needs of people struggling in Africa and elsewhere. What the randomistas want is numbers – incontrovertible data to show which aid helps, and which doesn’t. And the chief randomista? She’s a petite professor with a strong French accent. Esther Duflo.

They set up an RCT in Kenya in which one group of people got a net for free and the other only got a discount. The people who got nets at no charge actually proved twice as likely to purchase a new net as those who paid $ 3 the first time around. “People do not get used to handouts,” Duflo succinctly points out. “They get used to nets.”

The randomistas don’t think in terms of models. They don’t believe humans are rational actors.

Doing randomized controlled trials in poverty-stricken countries is difficult, time-consuming, and expensive. Even so, RCTs across the globe have shown that, over both the long and short term and on both a large and small scale, cash transfers are an extremely successful and efficient tool.

The time has come to put paid to what Duflo calls the three I’s of development aid: Ideology, Ignorance, and Inertia. “I don’t have many opinions to start with,” she said in an interview a few years ago. “I have one opinion – one should evaluate things – which is strongly held. I’m never unhappy with the results. I haven’t yet seen a result I didn’t like.”

The OECD estimates that poor countries lose three times as much to tax evasion as they receive in foreign aid.

Four different studies have shown that, depending on the level of movement in the global labor market, the estimated growth in “gross worldwide product” would be in the range of 67 % to 147 %. Effectively, open borders would make the whole world twice as rich.

On the eve of World War I, borders existed mostly as lines on paper. Passports were rare and the countries that did issue them (like Russia and the Ottoman Empire) were seen as uncivilized.

In this era of “globalization,” only 3 % of the world’s population lives outside their country of birth. The world is wide open for everything but people.

Economic growth isn’t a cure-all, of course, but out beyond the gates of the Land of Plenty, it’s still the main driver of progress.

That’s right: a mere eight people are richer than 3.5 billion people put together.

In the twenty-first century, the real elite are those born not in the right family or the right class but in the right country.

In the 1960s, millions of Mexicans crossed the U.S.-Mexico border, but in time 85 % returned home. Since the 1980s, and especially since 9/11, the U.S. side of the border has been heavily militarized, with a 2,000-mile wall secured by cameras, sensors, drones, and 20,000 border patrol agents. Nowadays, only 7 % of illegal Mexican immigrants ever go back.

Opening our borders is not something we can do overnight, of course – nor should it be. Unchecked migration would certainly corrode social cohesion in the Land of Plenty. But we do need to remember one thing: In a world of insane inequality, migration is the most powerful tool for fighting poverty.

Humans didn’t evolve by staying in one place. Wanderlust is in our blood. Go back a few generations and almost everybody has an immigrant in the family tree.

How Ideas Change the World

“A man with a conviction is a hard man to change.” So opens Leon Festinger’s When Prophecy Fails, his account of a cult that kept believing even after its doomsday prophecy failed to come true – first published in 1956 and a seminal text in social psychology to this day. “Tell him you disagree and he turns away,” Festinger continues. “Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

When reality clashes with our deepest convictions, we’d rather recalibrate reality than amend our worldview.

The thing is, we know that ideas have changed over time. Yesterday’s avant-garde is today’s common sense.

The question is not can new ideas defeat old ones; the question is how.

A worldview is not a Lego set where a block is added here, removed there. It’s a fortress that is defended tooth and nail, with all possible reinforcements, until the pressure becomes so overpowering that the walls cave in.

If there were ever two people who dedicated their lives to building castles in the sky with preternatural certainty that they would someday be proven right, it was the founders of neoliberal thought.

Nowadays, “neoliberal” is a put-down leveled at anybody who doesn’t agree with the left. Hayek and Friedman, however, were proud neoliberals who saw it as their duty to reinvent liberalism.

This particular story begins on April 1, 1947, not quite a year after Keynes’ death, when forty philosophers, historians, and economists converged in the small village of Mont Pèlerin in Switzerland. Some had traveled for weeks, crossing oceans to get there. In later years, they would be known as the Mont Pèlerin Society. In the 1970s, Hayek handed the presidency of the Society over to Friedman.

The crisis came in October 1973, when the Organization of Arab Petroleum Exporting Countries raised oil prices by 70 % and imposed an oil embargo on the U.S. and the Netherlands. Inflation went through the roof and the Western economies spiraled into recession. “Stagflation,” as this effect was called, wasn’t even possible in Keynesian theory. Friedman, however, had predicted it.

It would seem that we have arrived at “the end of history,” with liberal democracy as the last stop and the “free consumer” as the terminus of our species.

But the question is, what is the value of free speech when we no longer have anything worthwhile to say? What’s the point of freedom of association when we no longer feel any sense of affiliation? What purpose does freedom of religion serve when we no longer believe in anything?

It is high time, then, that we, the inhabitants of the Land of Plenty, staked out a new utopia.

A fifteen-hour workweek, universal basic income, and a world without borders… They’re all crazy dreams – but for how much longer?

Epilogue

Where lowercase politics acts to reaffirm the status quo, capital-P Politics breaks free.

It was Joseph Overton, an American lawyer, who first explained the mechanisms of uppercase Politics in the 1990s. He began with a simple question: Why is it that so many good ideas don’t get taken seriously?

To make the radical reasonable, you merely have to stretch the bounds of the radical.

The worldview of the underdog socialist is that the neoliberals have mastered the game of reason, judgment, and statistics, leaving the left with emotion. Their heart is in the right place: underdog socialists have a surfeit of compassion and find prevailing policies deeply unfair.

The underdog socialists’ biggest problem isn’t that they’re wrong. Their biggest problem is that they are dull. Dull as a doorknob. They’ve got no story to tell, nor even any language to convey it in.

Sadly, the underdog socialist has forgotten that the story of the left ought to be a narrative of hope and progress.

The greatest sin of the academic left is that it has become fundamentally aristocratic, writing in bizarre jargon that makes simple matters dizzyingly complex.

It all starts with reclaiming the language of progress.

  • Reforms?
  • Meritocracy?
  • Innovation?
  • Efficiency?
  • Cut the nanny state?
  • Freedom?

[1] In the book on page 32

[2] In the book on page 76

[3] In the book on page 107

[4] In the book on page 117

[5] In the book on page 126

[6] In the book on page 154

[7] In the book on page 176
