The 20th century, it is said, was a short one. Birthed in the trenches of Verdun after 1914 and buried alongside the Soviet Union around 1990, it lasted a mere 76 years. Contrast that, for example, with the previous century, described by the eminent (and recently deceased) historian Eric Hobsbawm as the “long 19th century”: the period between the French Revolution in 1789 and the outbreak of World War I in 1914. Historians pick these dates – rather than following the rhythm imposed by the calendar – because they carry historical meaning: They signal epochal shifts and provide a structure to the steady flow of history. Our Gregorian calendar is itself rooted in one such monumental event: the birth of a carpenter’s son, whose legacy first inspired worshippers in the Middle East and later left its mark on much of the history of the West and, indeed, of the world.
By that standard, the 21st century arrived almost on time. From 1989 until 2001, the End of History seemed nigh. After a century that had been pockmarked by mass violence and hemispheric conflict, liberalism’s triumph appeared inevitable. Anomalies such as the Castro brothers in Cuba, who would soon succumb to old age, or China, which would eventually dismantle its centralized structure to remain competitive in a globalizing world, were at best regarded as temporary hiccups. Iron curtains fell alongside trade barriers and tariffs as the Western model of development was transported into the far corners of the earth by the IMF, the World Bank, Coca Cola, and whoever else decided to join the carnival. It was the Washington Consensus going full throttle.
Then 9/11 happened. The British tabloid “The Sun” declared on its front page on September 12, 2001: “The Day that Changed the World.” Liberalization discourses were replaced by security discourses. The years after 2001 seemed to confirm the fears of pessimists like Samuel Huntington, who had warned that the Cold War wouldn’t be replaced by an era of peace and prosperity but by religious and cultural conflicts – the Clash of Civilizations. Much of the West embarked on the twin project of hunkering down under the protection of newly created or newly empowered institutions like the American Department of Homeland Security, and of sending its sons and daughters out into the world to fight the good fight. The new US Army Counterinsurgency Field Manual provided the practical blueprint; the TV series “24” served as the appropriate cultural echo chamber.
Thirteen years later, that analysis is beginning to fray. 9/11 ruptured the self-assured optimism of the 1990s, especially (and understandably) in the United States, but the epochal shift towards global contestation and dirty warfare hasn’t quite happened. We’re lucky in that regard. Terrorism hasn’t so much proliferated as it has been kept alive by insurgencies and the military responses they provoked. Much of the terrorist violence directed against the West since 2001 can be attributed directly to 9/11 itself and to the two wars that we fought in its wake.
Now Iraq and Afghanistan are winding down, and President Obama has acknowledged the long-overdue shift away from a diffuse fight “against terror.” More than a decade after 9/11 we don’t find ourselves locked in battle against an ideological or religious enemy. Indeed, annual deaths from terrorism are lower today than during most of the 20th century. The price that most of us have paid isn’t inflicted by an external enemy but imposed through the expansion of our internal security apparatuses. One of the reasons why people on the Left – and some small-state liberals – have reacted so strongly to recent revelations about the scope of government surveillance programs is that the threat level which was initially invoked to justify the passage of legislation like the “Patriot Act” or the “FISA Amendments Act” seems hardly plausible today.
Seventy years of growth
But amidst the war on terror, when few people seemed to pay attention, another thing happened. Lehman Brothers collapsed. Of course one could pick a different event, such as the US Treasury’s virtual takeover of Freddie Mac, or the revelations about the Greek deficit after the country’s 2009 elections. But the collapse of Lehman Brothers has emerged as the most visible event of the financial crisis, and as a lightning rod it will suffice. What matters is not the event itself but the historical shift it signals. In this case, the shift is away from the economic paradigm which has sustained much of the world’s growth and economic policies since at least 1945.
As I am writing this, 300,000 people are marching in Rio de Janeiro against corruption and public sector cuts ahead of next year’s soccer World Cup. It must be bad if Brazilians start anti-soccer riots. In Greece, protests have been ongoing for several years. In Spain, Italy and Portugal, popular discontent has ousted several governments and continues to cause a headache for their successors (in Greece, the government coalition is crumbling right now). In Great Britain, cuts to the National Health Service inspired regular demonstrations and a special segment during the Olympic opening ceremony in 2012, which defiantly celebrated the NHS as one of the great achievements of modern British society. Look at any newspaper front page today, and you are likely to see one or more photos of police in riot gear, shooting tear gas into crowds of protesters.
The decades since World War II have brought unprecedented increases in expectations about living standards in much of the world. I’m saying “expectations” because the actual increase in wealth and living standards has often been highly stratified: Those at the top benefit the most, sometimes at the expense of those at the bottom. In many countries of the Global North, inequality has increased significantly since the 1970s as real wages have stagnated or declined for many income groups. In the countries of South America, Asia, and parts of Africa, living standards for the middle class have increased somewhat, but the amount of money accumulating at the top meant that middle class expectations often continued to outpace actual improvements. No wonder, then, that protests in Rio have driven a broad cross-section of the population into the streets. As the BBC’s Paul Mason reminds us, the chances for upheaval are much higher when the middle class grows frustrated: “Even where you get rapid economic growth, it cannot absorb the demographic bulge of young people fast enough to deliver rising living standards for enough of them.”
Additionally, much of the economic growth since 1945 has been financed not through real growth but through credit. It’s “buy now, pay later” but on a global scale. For most of the 19th and 20th centuries, Western economies could avoid stagnation by amping up productivity, expanding domestic production and consumption and, when that leveled out, by tapping into new markets abroad. At its core, imperialism was an economic strategy. But where do you go once the world has been thoroughly globalized and the rate of productivity increase is declining? The answer, formulated even before the guns of World War II had fallen silent, was to go virtual: Growth would henceforth rely on credit, which could provide the necessary liquidity for investments and purchases. The Bretton Woods conference in 1944 marked an important turning point in the virtualization of monetary policy; the introduction of the first consumer credit cards in the US in the early 1950s virtualized personal consumption.
We’re now facing the consequences of that trend: Rising public and private debt are the price we have to pay for the economic miracle of the late 20th century. Instead of experiencing a proliferation of terrorism, we are witnessing a proliferation of resistance against the standard economic model of the last seventy years. It’s an endogenous shock, not an exogenous one: It emerges from within the current system after seventy years of unparalleled growth. As a result, it poses a much bigger challenge: Hunkering down won’t help as long as the economic foundations remain fragile.
An interlude before the real shock
The confluence of public and private debt leads to the unfortunate situation that welfare budgets are being slashed in the name of austerity while private citizens are themselves feeling the debt squeeze. It’s true in Brazil, true in Spain, and true in Greece as well.
From the perspective of debt, we haven’t quite hit rock bottom yet: Countries with the highest public debt-to-GDP ratios often have lower rates of household debt that can help to cushion the effects of public spending cuts. Italy and Japan, for example, carry high levels of public debt without a household debt bubble. But imagine what happens if US deficit reduction plans go through over the coming decade: the United States already has one of the highest private debt levels in the world, largely thanks to mortgages and student debt. While automation – resulting, ideally, in rising productivity and the outsourcing of menial labor to machines – might offer a partial way out of the debt trap, it’s difficult to see how debt-financed growth could be sustained unchanged for another fifty years. As the system begins to crumble, the weakest links in the chain go bust first: those who rely on a functioning welfare state for their existence and socio-economic opportunities. Next come those whose dreams have been thwarted and whose expectations have been left unfulfilled. It’s still working fine if you’re at the top, but the socio-economic contract has been broken.
No wonder, then, that a sense of unfairness and powerlessness has taken root and is now driving people into the streets across national and continental borders. The discontent with representative politics – still the least bad option we’ve got in many regards – is rooted not so much in the alleged gap between “the people” and “the politicians” as in the steadfast insistence that a few piecemeal changes will suffice to straighten out the crisis.
The language employed by protesters in Rio, Athens, and Madrid – against “corruption,” against “top-down politics,” against “welfare squeezes” – speaks to a commonality of experience that transcends the particularities of each context: Economic policies are broken, and politics seems unable to provide a fix. It’s thus misleading to think of the crisis of the last five years primarily as an “economic crisis.” Sure, it all started with a downward cascade in the financial and mortgage markets, but it has long since morphed into a social and political crisis. It is sure to leave its mark not only on economic history or on the history of a specific country, but on history as such.
All of which now brings us back to the story of the 21st century. Forget what you’ve heard about terrorism: the years between 2001 and 2008 weren’t so much the start of a decennial or centennial trend of civilizational conflict as a historical anomaly; an interlude to a much larger story that has become clear only with the benefit of hindsight. The big shift of our time, the epochal change that affects or will affect billions of people around the globe, isn’t the rising threat of terrorism, but the rising precariousness of economic realities. The story of the 21st century begins in earnest with ticker news about imminent financial collapse.