09.08.2023

We're early in history—if we change our priorities


It is our moral responsibility to act in line with the interests of those yet to be born.


AI-generated illustration from Midjourney



What do we owe the future?

Which group is most often ignored in policymaking? Is it the young? Or perhaps the old? I think the answer is future generations. And I am thinking not only of our children's children, but of all the thousands of generations that could live, if we play our cards right.

In his book What We Owe the Future, Oxford philosopher Will MacAskill asks where we would find ourselves if all of human history were a single book. We have already been around for 300,000 years. Even so, it is quite possible that we are still on the first page. If each page represents about as many people as have lived to date, roughly a hundred billion, then the story of humanity, even on conservative estimates, could run to tens of thousands of pages.
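A rough back-of-the-envelope version of that page count, with illustrative assumptions of my own (a ten-million-year horizon and roughly today's population; these are not MacAskill's figures), looks like this:

\[
\text{pages} \approx \frac{N_{\text{future}}}{10^{11}},
\qquad
N_{\text{future}} \approx \underbrace{10^{10}}_{\text{people alive at a time}} \times \underbrace{\frac{10^{7}\ \text{years}}{10^{2}\ \text{years per lifetime}}}_{\text{population turnovers}} = 10^{15},
\]
\[
\text{pages} \approx \frac{10^{15}}{10^{11}} = 10^{4}.
\]

Ten million years is long compared with human history so far, but short compared with the hundreds of millions of years Earth is expected to remain habitable, and even this horizon yields a book of roughly ten thousand pages with ours still the first.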

That the future may be vast should make us rethink whether we, as individuals, as societies, and as a civilization, have our priorities in order.

Are future lives worth as much?

A natural question is whether we should care about the distant future at all. There are two possible sources of doubt. One is practical: can we know anything about what the people of the future need from us? We will come back to that. The other objection is more fundamental: do we owe anything at all to humans living far in the future?

The answer to that is yes. To use one of MacAskill's examples, imagine you are on a forest walk and smash a glass bottle on the trail. Barely half an hour earlier, you passed some children, and you fear they will cut themselves on the shards if you do not clean up after yourself. Clearly, you should clean it up. But does this become any less clear if the children who will cut themselves are not just around the corner, but come running barefoot in a year, or in a hundred years?

The answer is no. Good is good and evil is evil, whether they occur today or in a thousand years. That is why it is not just the size of the future that is potentially huge; so is its value. To the extent that our actions can affect the lives of those who will live in the future, it matters colossally whether our influence is good or bad.

If we want to create a better world, then, we should focus considerably more than we do today on how our actions affect the distant future. This is what MacAskill calls longtermism, or what we might call long-term ethics in Norwegian.

What can be done?

Let us return to the practical question. Even if one accepts the claim that we have strong moral reasons to influence the future positively, it is not obvious how we should go about it. It is hard enough to predict the consequences of our actions in the near future; trying to say anything about consequences thousands of years from now can seem hopeless. But we are not as powerless as one might think. There are several measures that can drastically increase the expected value of the future.

One is to make sure the future is good. To that end, it is essential not to "lock in" the wrong values, as MacAskill puts it. Imagine that the values and norms of the world of 1822 had been permanently locked in. Then the saga of humanity could have consisted of tens of thousands of pages of slavery and extreme oppression of women, rather than just one such page. I believe that in 2022 we have moral blind spots that future generations will look back on with much the same disgust as we feel toward slavery. One such blind spot, which MacAskill and others point to and which is beginning to dawn on many, is our industrial animal farming; but we should be open to the possibility that there are more. We can avoid locking in the wrong values, and promote moral progress, by safeguarding a culture of value pluralism and political diversity that encourages continuous criticism of prevailing dogmas.

The second broad category of measures consists in increasing the chance that there is a future at all. About a week before the coronavirus pandemic hit the world in earnest, MacAskill's colleague Toby Ord published the book The Precipice, in which he warns of so-called existential risks facing humanity. These are outcomes that either wipe out human civilization entirely or permanently destroy its long-term potential. Ord estimates the probability of an existential catastrophe occurring in the next hundred years at 1 in 6, the same odds as Russian roulette. Likely causes Ord points to include nuclear war, man-made pandemics, and uncontrolled artificial intelligence. MacAskill thinks the chance of an existential catastrophe this century is somewhat lower and pegs it at 1 in 100. This is about the same probability that an average American has of dying in a car accident, a risk we work hard to minimize, both as a society and as individuals. Despite the uncertainty surrounding the exact figures, the message is clear: we should devote far more resources to preventing existential catastrophes.
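As a rough sanity check on that comparison (using round public figures, not numbers from either book): the United States sees on the order of 40,000 motor-vehicle deaths per year in a population of about 330 million, which over an 80-year life accumulates to about one percent:

\[
\frac{4\times 10^{4}}{3.3\times 10^{8}} \approx 1.2\times 10^{-4}\ \text{per year},
\qquad
1.2\times 10^{-4} \times 80 \approx 0.01 = \frac{1}{100}.
\]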

With astronomical stakes

An existential catastrophe this century would not just mean that those of us alive at the time are robbed of the rest of our lives. It would also mean that trillions of good lives that could have been lived never get the chance. The stakes are thus astronomically higher than we usually assume. It was this insight that led the intellectual godfather of long-term ethics, the philosopher Derek Parfit, to point out that the difference between a catastrophe in which all people die and one in which 99% die is morally far greater than the difference between an outcome in which 99% die and one in which no one dies.

Those who are not yet born cannot, for obvious reasons, take part in today's decision-making. Although market transactions and policy decisions affect them, they have no way to voice their preferences. It is therefore our moral responsibility to act in line with their interests. If we manage that, we may find ourselves on the first page of a page-turner that just gets better and better.
