Many have asked for my thoughts on the U.S. elections. After taking time to reflect, my analysis is presented below. It’s a long read, but I hope it offers some value as we chart a path forward for democracy, human rights and equality.
On August 6 of this year, I appeared on Polish television and was asked who Kamala Harris would choose as her running mate. “Pennsylvania figures to be a crucially important state,” I told the program host. “So, I would not be surprised if Vice President Harris opts to select the governor of Pennsylvania in order to try and shore up that critical state.”
I was wrong; the next day, Harris announced Tim Walz. I was also wrong about the importance of Pennsylvania: even had Harris won the state’s 19 electoral votes, Trump still would have prevailed, 293 to 245.
That’s because this election was not about one or two states. It was not about the “economy” or “inflation.” Rather, this election marked the conclusion of a process whereby society re-organized itself, with one superstructure becoming permanently unwound and a new one crystallizing in its place. Such a process did not happen in one day; it had been unfolding over the past decade. But Tuesday was the inflection point, the moment of no return.
Simply put, November 5, 2024, was the night the 20th century ended. Allow me to use this week’s newsletter to articulate why I feel this way, and what we can do to shape our new world so that it reflects the values we hold dear.
I’m currently teaching a course on the Middle East, and in it I’ve suggested to my students that there was a “Long 19th Century.” Even if the 19th century technically lasted from 1800 to 1900, it actually began, I’ve argued, with the French Revolution in 1789 and ended with the conclusion of World War I and the Paris Peace Conference in 1919.
The argument rests on the premise that centuries are not solely defined by dates, but also by the interconnected technologies, institutions and ideologies that shape people’s decisions and world events. Those technologies, institutions and ideologies don’t simply disappear when the calendar turns. (The challenges of December 1899 didn’t end in January 1900). They remain in place until new events, new technologies, new people and new ideas slough off the old world and forge a new one.
The “Long 20th Century,” then, began in 1920, emerging out of the wreckage of WWI. That war killed more than 20 million people, including men blown apart on the battlefield, entire communities slaughtered and starved, and a global population ravaged by disease. From the carnage emerged new nation-states born from dissolved empires (the Ottoman, Austro-Hungarian and Russian empires all fell during the war), and a nascent belief (highly contested, of course) that such nations must cooperate within international structures in order to stave off the brutal horrors that humans were capable of. Nations, President Wilson wrote in his Fourteen Points, had intrinsic rights to “territorial integrity,” and force should not be used to change their borders. Rather, countries should rely on open diplomacy to work through their challenges.
The idea failed; the League of Nations could not prevent the horrors of the 1930s and 1940s, including Japanese crimes in China and the genocide of Jews and Roma by the Nazis. But its ideals were reinvigorated after the Second World War—a conflict that killed more than 50 million people—and reconstituted in the United Nations and other multinational institutions such as the World Bank, World Health Organization and NATO.
Imperfect and flawed as they were—and always rife with agendas, politics and hypocrisies—these institutions played major roles in propagating a set of beliefs that animated society in the ensuing decades. Those beliefs included the assumption that institutional structures, when properly funded and supported, could advance diplomacy, education, medicine and, ultimately, peace. They sparked the creation of USAID, the growth of the State Department, public diplomacy, the field of international development, the Peace Corps, Doctors Without Borders and a network of global governmental and non-governmental organizations founded on complementary ideals. The institutions were often led by the United States and other “Western” powers, but they were not confined to them. In the Global South, for example, ideas of transnational solidarity for the purposes of peace, prosperity and economic development guided Pan-Arab and Pan-African movements, as well as the “Third World Project,” which attempted to build solidarity among nations that had formerly been colonized. As Indonesian statesman Sukarno said at the 1955 Bandung Conference, “We the 1.4 billion strong who are speaking with one voice can mobilize in favor of peace.”
These structures were undergirded by particular types of technologies. The technologies of the 20th century were mechanical and industrial, an orchestra of cogs, wheels, motors and engines. They were also big: large coal plants, giant oil rigs, huge freighters and airplanes, enormous newspaper presses, big factory floors, all churning and pumping in mechanized motions. This industrial arrangement allowed individual workers to be inserted into assembly lines to guide the machines, participating in the production (and consumption) of automobiles, consumer products, building materials and physical electronics—creating massive amounts of goods, massive amounts of wealth, and degrading the environment in planet-altering ways.
The model was linear, i.e., an assembly line with a beginning, middle and end. Those linear structures extended beyond industry to education and the workforce. School systems became assembly lines for students: enter in kindergarten, progress through a path of elevating grade levels, exit after high school. Universities and post-graduate studies were similar knowledge production factories, a metaphor used by Clark Kerr, then president of the University of California, in a 1963 speech and book that positioned the university as the centerpiece of the “knowledge industry” churning out PhDs and master’s degrees. As workers retired, new ones entered, a beautiful techno-educational machine in motion.
Large numbers of knowledge workers were necessary because the 20th century was built in an expert-centric manner. Subject matter experts staffed the growing diplomatic and international corps. Experts staffed the educational and academic ranks, training more experts to come after them. Experts staffed the finance world, the corporate world, militaries and governments. The scientific enterprise exploded, swelled with funding from governments and the private sector, embedding scientific experts across national security, the economy and healthcare. Even the mass media evolved into an expert-centric enterprise, with “reporters” morphing into “journalists” who gained prestige from their expertise in particular regions or subjects (think of the award-winning Foreign Affairs Correspondent or the Senior Business Reporter). All of this was perpetuated by cycles of prizes and awards (Nobel Prizes, Pulitzer Prizes, World Food Prizes, etc.) wherein experts and institutions lauded their own excellence among other experts and institutions.
Finally, all of this was documented and transmitted through linear media forms. Films had beginnings, middles and ends; newspapers had sections that started at A1 and were organized linearly through sections B, C and D. To watch a movie in a theater you were expected to arrive on time; to understand the end of a TV show you had to watch from the beginning. This is not meant to suggest the media landscape wasn’t complex or guided by corporate interests; read Marshall McLuhan or watch the 1976 film Network. But the forms were linear, and with them came semi-predictable patterns of how consumers would behave and interact.
Still, the 20th century was hardly free of war and destruction. Beyond World War I and World War II, wars in the Middle East, genocides in Cambodia and Rwanda, the massacre at Srebrenica, the Cultural Revolution in China, Soviet imperialism, nuclear weapons, and environmental degradation all caused immense suffering. Yet a belief persisted throughout the century—perhaps naively so—that expertise, diplomacy, institutions, global cooperative agreements and shared economic prosperity could keep humanity from destroying itself and the planet. The International Space Station, the Intergovernmental Panel on Climate Change, Earth Day, the Strategic Arms Limitation Talks... these were all transnational initiatives imbued with a 20th century logic that when we locked arms across the planet, we could devise solutions to our global challenges.
There was no single day when this logic came undone. Indeed, forces within the 20th century were fighting to undo it from the outset. Nothing in this argument is meant to suggest a uniformity in how 6 billion people lived from 1920 through the recent past.
Still, as the calendar turned to 2000, this organizing principle held. After the terror attacks of 9/11, NATO invoked Article 5, and the U.S. made its case to the United Nations for retaliation against the Taliban and, later, Saddam Hussein. In retrospect, the Iraq War likely represented the beginning of the end of this world, a calamitous conflict that killed tens of thousands, lobbied for by the very institutions and experts meant to advance peace and diplomacy: the United Nations, the New York Times, etc. Amid the tumult of the Iraq War came the global financial crisis, where again the institutions and experts meant to promote stability and prosperity—the World Bank, the Federal Reserve, large banks—failed to protect the everyday person, with nary an entity facing repercussions for the collapse.
Concomitant with these global shocks emerged three massively disruptive technological forces: social media, Silicon Valley and Bitcoin. The social Web—Wikipedia, Facebook, YouTube, Twitter, etc.—wrested communicative power away from experts and the legacy press, while simultaneously flooding our information ecosystem with new media forms. The infinite scroll may go down in history as the most consequential invention of our time; integrated into the mobile phone and connected to the Internet, it meant that we could each consume user-generated media ad infinitum: never a beginning, never an end, never in a straight line. We would always enter in the middle of the feed and leave before the feed had completed, hopping from destination to destination through a dizzying maze of content and hyperlinks on devices designed to addict us and surveil us. Other media replicated the model: Netflix, Amazon Prime and Apple TV became repositories of never-ending content, serving up movie franchises such as Marvel or Star Wars that, themselves, never ended, an infinite scroll of interconnected stories, all powered by algorithms designed to keep us inextricably tethered to our screens.
This ecosystem privileged attention above all else, forging the incentives for staggering amounts of misinformation and disinformation. Accuracy mattered less amid millions of pieces of content every minute, for accuracy was no guarantee of visibility. What mattered was surprise, shock, subversion of expectations, defiance of standards and conventions… anything to attract eyeballs. The more eyeballs that could be attracted, the more power and influence could be gained, regardless of expertise or experience.
The technology that powered this new ecosystem was small, as opposed to large: small chips, small phones, small circuit boards, small batteries, laptops instead of desktops, wearable, disposable, portable and mobile. As always, our behaviors and tastes came to resemble the technologies we used: nimble and agile became preferable to static and durable. Why be tied to a permanent home, partner or career when one could be portable and mobile, work remotely, and hop from job to job? Why be limited to a national currency when users could create their own virtual currencies that operated outside the confines of the existing financial system? Why allow the institutions of the past to set the boundaries of possibility when so many possibilities existed beyond those boundaries?
This sounds like a renaissance of freedom and liberty. But it’s important to remember that it occurred amid other structural changes, namely the displacement of jobs and the hollowing out of manufacturing communities. While the previous technologies had created jobs, the new technologies were displacing them. That displacement was felt acutely among the working classes, and it meant that the rapid proliferation of new technologies was inextricably intertwined with anger toward the institutions that had failed to intervene. The expert-centric institutions, it was argued, had become overburdened with bureaucracy, unconcerned with the plight of working people, and arrogantly tone-deaf. (Billionaires flying on private jets to Davos for the World Economic Forum while the Swiss Alps melted in the background became, perhaps, the most iconic juxtaposition.) On both the Left and the Right, a belief grew that these institutions principally existed to perpetuate their own privilege, and that liberation lay in user-centric technology, radical activism and strident self-reliance: I am my own brand, my own retirement plan, and my own media ecosystem. As Elon Musk recently posted on X, “You, the individual, are the media now.” No institution can be more trustworthy than what I am able to know myself.
It is worth repeating that this unwinding took two decades to unfold. If you read George Packer’s masterful 2013 book The Unwinding, you can see it taking shape. The Iraq War, the financial crisis, the Arab Spring, Brexit, Trump, the rise of China, social media, disinformation… all have contributed. But the breaking points were the Covid-19 pandemic and the Russian invasion of Ukraine.
Perhaps the greatest legacy of the expert-centric institutions of the 20th century was mass vaccination. In the previous century, polio outbreaks—such as in New York City in 1916—killed thousands of people and infected tens of thousands more. In a post-polio-vaccine world, two of the three wild polioviruses were eradicated and mass vaccination saved the lives of millions. Likewise, measles once infected millions of Americans each year and killed hundreds; worldwide, vaccination drove measles deaths down by roughly 80 percent between 2000 and 2017. Yet amid distrust in expert-centric institutions, massive disinformation, and a belief that no institution can be more trustworthy than what I can know myself, mass vaccination came to be seen, at best, as a form of oppression and, at worst, as a giant conspiracy. As I documented last year in an article for Real Clear Politics, “In Europe, Disinformation is Winning,” nearly 25% of Germans were never vaccinated against the coronavirus. In Bulgaria the unvaccinated rate was nearly 70%, and in North Macedonia it was nearly 60%.
On the heels of the pandemic came the Russian invasion of Ukraine. Such an invasion flew in the face of President Wilson’s plea 100 years earlier for “territorial integrity” and open diplomacy. It placed NATO and the United Nations in a conundrum for which they have yet to devise a way out. Bring Ukraine into NATO and invoke Article 5—and risk nuclear war? Or send arms and funding, but place restrictions on how they can be used? Much like the League of Nations a century ago, the international institutions dedicated to preserving peace have been unable to stop the madman at the door intent on war.
President Biden was the last of the 20th century Presidents. A man whose Congressional career began in 1972, Biden came of age as an institutionalist. Despite all his flaws—and he had many—he held onto a 20th century logic that when we locked arms across the aisle or across the planet, we could devise solutions to our global challenges.
Ultimately, Biden was undone not by President-elect Trump, but by his own political party. The far-left flank had grown weary of his institutionalism, his careful diplomacy, his support for Israel, his reverence for old alliances and legacy structures. To quote one such activist who participated in his ouster, “I am ready to burn it all down.” Biden’s debate performance was a pretext for a plan that the ceasefire crowd had envisioned for much longer: something drastic to capture media attention and show the older generation that times had changed. Harris was viewed as the 21st century candidate—multiracial, female, progressive, younger. But once in charge, she ran a campaign stuck in 20th century logic. As John Lawrence, former Chief of Staff to Nancy Pelosi, noted in his reflection on the race, Harris resuscitated two key messages from earlier Democratic platforms: “we’re not going back” and giving people a “hand up.” “It was a clunker in 2012,” Lawrence wrote, “and didn’t inspire undecided voters any better this time around.”
Most critically, Harris stumbled where Trump thrived. Trump shrewdly seized every moment to become a meme for the infinite scroll of social media—dancing to “Y.M.C.A.” or working at McDonald’s. He embraced Bitcoin, locked arms with anti-vaxxer RFK, Jr., eschewed the legacy press, gained favor with iconoclasts such as Joe Rogan and Elon Musk, and promised that the jobs “they” took away would return under his leadership. Trump and his campaign learned from 2016 and 2020 that the old political model was dead and a new one had emerged. The electorate was cynical, angry, irreverent, anti-establishment and self-absorbed, and he would reflect that back to them. The candidate who promised to “burn it all down” would be the one the voters hoisted up.
So, where does this leave us as we head into 2025 and beyond?
To some, the devolution of one world and the remaking of another is cause for mourning—and indeed, during the past week that mourning has been evident across legacy media and elements of civil society. For others, however, the new world is cause for celebration—and not just political celebration. I have received several emails this week from people I know and respect who are eager to unlock an age of innovation and advancement: self-driving cars, open-source AI, personalized AI agents and new treatments for cancer and disease. The thinking is that over-reaching government regulation and stale, bureaucratic institutions have held society back (an underrated element of the election is how Biden’s war against cryptocurrencies, waged through SEC Chair Gary Gensler, factored into the anger felt by Trump supporters, Republicans and crypto-enthusiasts). A new generation of “bureaucracy-busters,” such as the Trump-Musk coalition and Javier Milei in Argentina, will, in this view, spur the renaissance society needs to unlock human flourishing.
Whether this is an opportunity or a calamity, it is clear that democracy faces stiff headwinds in our new era. Freedom House has documented that in nearly every region of the world, democratic benchmarks are declining. Freedom of the press has been curtailed, those who oppose their governments are being jailed or murdered, corruption runs rampant, mass surveillance is being imposed, rights and liberties are being stripped, and wars are being waged that put civilians in harm’s way. Even in regions where democracy remains relatively strong—Europe, Australia, the U.S.—the liberal arts and humanities are being dismantled, and disinformation and conspiracy theories proliferate (antisemitism, the world’s longest-running conspiracy theory, has consumed Europe and the U.S. in the past year). Our new age is—for the moment—marked by cynicism, distrust, war, surveillance, conspiracy and stridency. Perhaps most ominously, it is marked by a distaste—even a disdain—for cooperation across political parties, religions, ethnicities and nations.
So, what is the answer? Nostalgia will not redeem us; “the world has changed, and none of us can go back,” to quote Peggy Carter to Steve Rogers in the film Captain America: The Winter Soldier. Our task now, in my humble opinion, is to carry forward our achievements from the previous era and ingrain them into our new one. We know that vaccines are overwhelmingly safe and effective; we must continue to embrace them in order to safeguard public health. We know that our planet is warming beyond what humans and animals can endure; we must use our technology and international alliances to stem the damage before it is too late. We know that humans are capable of horrific crimes and abuses; we must invest in education that protects our most vulnerable populations and promotes peace and tolerance. And we know that, left to its own devices, a market without any regulation will create vast inequality, enriching a small few while leaving the vast majority behind. Smart policy frameworks that reward innovation, promote job creation, allow people to invest in their own education and up-skilling, provide affordable child and elder care, strengthen the humanities, and offer a safety net for those facing hardship are all necessary to prevent millions from falling so far through the cracks that recovery becomes impossible.
But our biggest task is to plug compassion, tolerance, peace, human rights, equal opportunity and democracy into a technological, political and media reality that often promotes the opposite. This need has been apparent for some time, yet perhaps we did not insist upon it forcefully enough, thinking (naively) that the 20th century models could still endure amid seismic shifts. It is time to recommit to these values, and I know, for my part, there is much that I can do, including rebranding this newsletter, reinvigorating the History Communication Institute, forcefully combating antisemitism and hate, and spreading the message of historical and media literacy further than it has yet reached.
These are big responsibilities, but I owe a debt to the past and the future to undertake them. I hope you’ll join me.
-JS