Tag Archives: longtermism

What “longtermism” gets wrong about climate change

Corn affected by the 2013 drought in Texas. Global warming is making summer droughts in Texas longer and more severe. USDA photo by Bob Nichols.

In his new book What We Owe the Future, William MacAskill outlines the case for what he calls “longtermism.” That’s not just another word for long-term thinking. It’s an ideology and movement founded on some highly controversial ideas in ethics.

Longtermism calls for policies that most people, including those who advocate for long-term thinking, would find implausible or even repugnant. For example, longtermists like MacAskill argue that the more “happy people” who exist in the universe, the better the universe will become. “Bigger is better,” as MacAskill puts it in his book. Longtermism suggests we should not only have more children right now to improve the world, but ultimately colonize the accessible universe, even creating planet-size computers in space in which astronomically large populations of digital people live in virtual-reality simulations.

Backed by an enormous promotional budget of roughly $10 million that helped make What We Owe the Future a bestseller, MacAskill’s book aims to make the case for longtermism. Major media outlets like The New Yorker and The Guardian have reported on the movement, and MacAskill recently appeared on The Daily Show with Trevor Noah. Longtermism’s ideology is gaining visibility among the general public and has already infiltrated the tech industry, governments, and universities. Tech billionaires like Elon Musk, who described longtermism as “a close match for my own philosophy,” have touted the book, and a recent article in the UN Dispatch noted that “the foreign policy community in general and the United Nations in particular are beginning to embrace longtermism.” So it’s important to understand what this ideology is, what its priorities are, and how it could be dangerous.

…click on the above link to read the rest…

The Dangerous Ideas of “Longtermism” and “Existential Risk”

So-called rationalists have created a disturbing secular religion that looks like it addresses humanity’s deepest problems, but actually justifies pursuing the social preferences of elites.

In a late-2020 interview with CNBC, Skype cofounder Jaan Tallinn made a perplexing statement. “Climate change,” he said, “is not going to be an existential risk unless there’s a runaway scenario.” A “runaway scenario” would occur if crossing one or more critical thresholds in the climate system causes Earth’s temperature to rise uncontrollably: the hotter the planet gets, the hotter it will get, via self-amplifying feedback processes. This is probably what happened a few billion years ago on our planetary neighbor Venus, a hellish cauldron whose average surface temperature is high enough to melt lead and zinc.

Fortunately, the best science today suggests that a runaway scenario is unlikely, although not impossible. Yet even without a runaway scenario, the best science also frighteningly affirms that climate change will have devastating consequences. It will precipitate lethal heatwaves, megadroughts, catastrophic wildfires (like those seen recently in the Western U.S.), desertification, sea-level rise, mass migrations, widespread political instability, food-supply disruptions/famines, extreme weather events (more dangerous hurricanes and flash floods), infectious disease outbreaks, biodiversity loss, mass extinctions, ecological collapse, socioeconomic upheaval, terrorism and wars, etc. To quote an ominous 2020 paper co-signed by more than 11,000 scientists from around the world, “planet Earth is facing a climate emergency” that, unless immediate and drastic action is taken, will bring about “untold suffering.”

So why does Tallinn think that climate change isn’t an existential risk? Intuitively, if anything should count as an existential risk, it’s climate change, right?

Cynical readers might suspect that, given Tallinn’s immense fortune of an estimated $900 million, this might be just another case of a super-wealthy tech guy dismissing or minimizing threats that probably won’t directly harm him personally. Despite being disproportionately responsible for the climate catastrophe, the super-rich will be the least affected by it. Peter Thiel—the libertarian who voted for a climate-denier in 2016—has his “apocalypse retreat” in New Zealand, Richard Branson owns his own hurricane-proof island, Jeff Bezos bought some 400,000 acres in Texas, and Elon Musk wants to move to Mars. Astoundingly, Reid Hoffman, the multi-billionaire who cofounded LinkedIn, reports that “more than 50 percent of Silicon Valley’s billionaires have bought some level of ‘apocalypse insurance,’ such as an underground bunker.”

That’s one possibility, for sure. But I think there’s a deeper reason for Tallinn’s comments. It concerns an increasingly influential moral worldview called longtermism. This has roots in the work of philosopher Nick Bostrom, who coined the term “existential risk” in 2002 and, three years later, founded the Future of Humanity Institute (FHI) based at the University of Oxford, which has received large sums of money from both Tallinn and Musk. Over the past decade, “longtermism” has become one of the main ideas promoted by the “Effective Altruism” (EA) movement, which generated controversy in the past for encouraging young people to work for Wall Street and petrochemical companies in order to donate part of their income to charity, an idea called “earn to give.” According to the longtermist Benjamin Todd, formerly at Oxford University, “longtermism might well turn out to be one of the most important discoveries of effective altruism so far.”

…click on the above link to read the rest of the article…
