The 2050 Project
Bloggers and writers make predictions for the year 2050
In August I published “Futurists have their heads in the clouds” which made a set of predictions for 2050. The goal was to avoid the common failure mode of futurists, which is to morph into a sci-fi writer who can’t plot. This attempt ended up inspiring the Slime Mold Time Mold blog to organize the 2050 Project, recruiting a slew of other bloggers and writers to make their own predictions for the year 2050.
Such prognostications are extremely fun to read, likely because 2050 is close enough that, if we’re lucky, it will occur in many of our lifetimes. A mere 28 years? Just a few blinks, really. Enough space for a career, a child, the love of your life, a retirement, some travel, and then we’re there, waking up in those bleary 2050 mornings. The closeness of the year helps with realism, or at least, specificity—indeed, some of the predictions from my original post have ended up on the prediction market Metaculus. And yet, simultaneously, 2050 is just far enough away to invite wild questions like “where’s my flying car?” (which is certainly something I myself am guilty of asking).
So here are some of my favorite predictions by others from the 2050 Project.
Over at Slime Mold Time Mold the forecasting methods
. . . add something to Hoel’s original method of extrapolating “technologies or trends already in their nascent phases”: regression to the mean. What we mean by this is, well — the 20th century was very unusual in many different ways.
A lot of things that we take for granted are really, really new — like 401ks (invented in 1978), Traditional (1974) and Roth (1997) IRAs, and modern credit scores (1989). Indexes like the Dow Jones and the S&P 500 run back several decades, but index funds that track them only appeared in 1972. In 1940, only 5% of US adults over 25 had a college degree and only 25% had a high school diploma. Even income tax wasn’t a permanent part of the US tax system until 1913 — we had to do a whole amendment to the Constitution to make it happen.
Some of these may be here to stay, but looking back from 2050, a lot of 20th century “institutions” will look like a flash in the pan.
Indeed, COVID-19 may be a harbinger of such historical regressions, like how
. . . we’ll have to start paying attention to disease in the way our ancestors did. As historian Ada Palmer describes, “I have never read a full set of Renaissance letters which didn’t mention plague outbreaks and plague deaths, and Renaissance letters from mothers to their traveling sons regularly include, along with advice on etiquette and eating enough fennel, a list of which towns to avoid this season because there’s plague there.”
The post speculates that this could be due to endemic COVID-19, growing antibiotic resistance, or thawing permafrost. I’d like to add another reason: advances in virology itself. While we don’t know the origin of COVID-19, it certainly could have been from an accident—indeed, last year there was a lab leak of COVID-19 in Taiwan, caused by a researcher who was bitten by a mouse she was studying. This led to the country’s first COVID-19 case in a month. And this sort of thing is common—there have been six safety incidents involving coronaviruses in just the past five years at the University of North Carolina at Chapel Hill.
So in one of the worst outcomes for our future, the very people trying to stop plagues end up building a world-spanning network of laboratories that guarantees them, either via low-probability but never-completely-avoidable lab leaks, or, alternatively, by creating and perfecting the tools bad actors need to easily engineer viruses. If this came true, by 2050 expect the attitude toward science and scientists to have become considerably more negative.
Stephen Malina blogged his predictions for 2050 with a focus on AI, biotech, and crypto, imagining a future where AI is everywhere but many simple tasks remain unautomated.
Robotics has finally started to become useful as well, with robots increasing manufacturing productivity. That said, robots still aren’t capable of “messier” tasks at the level required for widespread adoption. For example, >95% of hedge clipping is still done by humans.
He describes a realistic version of future healthcare, positing it will be highly unequal in the distribution of advancements.
Direct-to-consumer personal health tracking has become even more common. Smart watches or their next iteration include glucose monitoring and other currently not available metrics. Outside of ‘omics screens, the healthcare system continues to mostly not integrate the more finely grained, time series data into its decision-making apparatus except for athletes, high net worth individuals who pay for personalized care, and individuals with specific issues (diabetes, cancer).
At Atoms vs. Bits, Jehan had some interesting and quite specific predictions. Like this one, based on the
. . . reasons to believe that nearsightedness can be controlled environmentally. Only 1.2% of rural Nepalese children are myopic compared to 59% of East Asian 17-year-olds in Australia.
This is likely caused by excessively looking at, and therefore focusing on, relatively nearby objects. Indoor environments tend to aggravate this tendency and there is evidence that time spent outdoors is effective at preventing myopia.
I predict that this finding will become well-established and widely recognized, resulting in parenting norms that prevent the development of nearsightedness. Allergies are currently undergoing a similar trend, where they are prevented or treated through exposure to allergens rather than pharmaceuticals. As this knowledge spreads among the educated, not needing glasses will become a class and cultural indicator, much the way straight teeth are now, which will drive adoption.
I actually disagree with Jehan here—not about the science or the cause of nearsightedness, but because I think screens are only going to become more common for children. And unlike allergies, glasses and contacts are extremely well-accepted culturally and don’t have much impact on quality of life. I just don’t see that changing. Indeed, I think the opposite: essentially everyone in America will be nearsighted by 2050. If you’d like to weigh in on this, you can make a prediction over at Metaculus (they set up a question for me on myopia prevalence by 2050).
Almost no one predicted a full-blown dystopia. Some even made me hopeful. Over at Experimental History it was pointed out that people will still read books in the future, since
. . . people have reported reading about the same number of books for forty years. Other studies find small declines in reading literature or reading for pleasure, but these seem to miss the rise in audiobooks (which now make more money than e-books) and might miss changes in what people read and why.
He also notes that
. . . people believe that morality has declined and how there’s very good evidence that it hasn’t. What’s especially remarkable is that people think the decline began when they arrived on Earth regardless of when that was. They even think kindness is dying off so quickly that you can tell the difference between today and a few years ago.
I’m not sure the evidence either way is that strong, but it certainly is a mark against the theory that each generation in its turn thinks morality, kindness, etc., have declined. Surely that can’t be true! But, then again, perhaps it is. For the same could be said of the value of the dollar. It’s always less, year after year.
Rohit at Strange Loop Canon explored how salaried jobs may eventually become an anachronism. He thinks the current shift away from physical offices
. . . will only increase this tendency. Right now the only folks who can do that are CEOs, already successful dilettantes, or self-employed consultants. But once you are no longer geographically bound, you can focus on doing a lot more of your intellectual work in a way that you enjoy. . . .
Our society already looks a little odd in that we have this at the top and the bottom of the income/status scales. At the top you have multiple board seats, non profit work, work on multiple projects, or just different income streams from investments. At the bottom you have the absolute necessity to have multiple jobs to ensure you have sufficient cashflow.
The middle group will catch up to this, through interest rather than necessity.
Sasha Chapin makes some humorous predictions of what will be happening in 2050. But perhaps they’ll prove correct, for wouldn’t I have laughed in 2000 at the idea that a reality TV star would be president in 2016?
My children will book a family trip to Marrakech with an AI that will take care of the whole thing, and I will be indignant about its decisions when the staff in our hotel are rude, ranting about how this is why we did things ourselves in the old days. This is basically the most significant impact that AI will have on society.
He also thinks things will get weird culturally in a big over-correction.
Cancel culture will mint too many cancel coins, resulting in their devaluation. The world will wake up from its collective hallucination that gaffes matter; it will be not only acceptable, but desirable, for politicians to send consensual dick pics, leak their nudes, be seen smoking crack, etcetera.
Which would make the world a pretty zany place, compounded by how the San Diego Zoo will apparently have woolly mammoths.
Matt Clifford over at Thoughts in Between speculates that
. . . the 21st century will be a century of secessions. First, technological, economic and military change is making the “minimum viable state” smaller. . . Second, the long/short volatility divide discussed above is eroding the political-economic compact between big cities and their hinterlands that has (partly) held nation states together for the last couple of centuries.
Perhaps this will be accelerated by remote work.
It’s getting easier for the most talented workers to measure and capture their marginal product. What predictions does this produce? First, more entrepreneurs. A greater proportion of the most highly skilled individuals will structure their careers so that they’re paid as equity holders, not salary recipients. Second, more inequality (alas).
When I originally published “Futurists have their heads in the clouds” a number of commentators pointed out I hadn’t mentioned climate change. Max Nussenbaum does a better job, sketching a middle-of-the-road perspective that seems viable:
The U.S. won’t pass any significant climate legislation in the next decade. Technological innovation, as well as market and cultural pressures, will lead to a significant reduction in worldwide emissions, but we won’t hit net-zero by 2050. We’ll blow through the IPCC’s 1.5° warming target and end up on track to hit somewhere between 2 and 3° by the end of the century.
Although this will cause significant suffering worldwide, for most people in developed countries climate change will be closer to a nuisance than an apocalypse. Many will incorrectly see this as evidence that global warming wasn’t the big deal the climate Cassandras said it would be.
Finally, there’s Secretum Secretorum, which eschews the idea of extrapolating from current trends and instead takes some big contrarian positions. Of particular interest is the idea of a “World Historic Individual” emerging. Specifically that
. . . by 2050 there will be a living person who is widely recognized to be what early 1900s German historian Oswald Spengler called a “World Historical Figure” (The Decline of The West). Jesus, Socrates, Alexander the Great, Buddha, Genghis Khan, Muhammed, Newton, Darwin are all on the list.
It’s always struck me that my generation, the millennials, have grown up with a particular lack of World Historic individuals (in terms of impact in the long march of history, not popularity or name recognition in the moment). We missed concurrency with most of the 20th-century luminaries. My parents’ early lives overlapped with Einstein, for instance. Indeed, it’s worth asking:
Who is the most recent person that could reasonably be called a world historical figure? I say yes for Gandhi and Martin Luther King, Jr. . . . After that. . . I’m not so sure. Off the top of my head, here’s a list of potential candidates: Mao Zedong, Osama Bin Laden, Obama, Trump, Elon Musk (sorry Bezos, you didn’t make the cut), and Xi Jinping. I think most of these people are debatable when you start to consider truly vast time horizons: what are the chances people will know about Obama, Xi, or Elon Musk 500 years from now?
Of those listed by Secretum Secretorum, the only individual I can imagine mattering in five hundred years is Elon Musk, although not for anything he’s done yet. But if he did establish a city on Mars, as there’s indeed a chance he might, getting humanity off-planet would certainly be remembered. Yet, it’s also possible the responsibility for the actual settlement of Mars will be spread out over the entire space industry, not to mention over various governments, ensuring no lasting historical credit goes to Musk.
So where are the Einsteins? Or Joans of Arc? The Gandhis? Let alone the Leibnizes or Napoleons or Christs? Perhaps there is someone unknown to us, some little girl waiting in the wings who will change everything. A comforting thought. For it is only in the brief periods when a World Historic individual bestrides the globe that humanity is no longer alone in the universe.
Right now we are children in a dark room, waiting for the hallway light to turn on and an adult to come save us. Some of us may live our whole lives in this dark.