Full Transcript
From the TED Talk by Will MacAskill: What are the most important moral problems of our time?
Unscramble the Blue Letters
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pendaimc that could permanently daeirl civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this forramwek.
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billoin poelpe on this planet and that means you and everyone you know and love. That's just a tragedy of unimaginable size. But then, what's more, it would also mean the curtailment of humanity's future potaneitl, and I believe that humanity's potential is vast. The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million yares. If the human race were a snlige iiandiuvdl, she would be just 10 years old today. And what's more, the human race isn't a typical mammalian species. There's no rsoean why, if we're careful, we should die off after only two million years. The earth will remian habitable for 500 million years to come. And if someday, we took to the stars, the civilization could continue for billions more.
Open Cloze
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global ________ that could permanently ______ civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this _________.
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven _______ ______ on this planet and that means you and everyone you know and love. That's just a tragedy of unimaginable size. But then, what's more, it would also mean the curtailment of humanity's future _________, and I believe that humanity's potential is vast. The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million _____. If the human race were a ______ __________, she would be just 10 years old today. And what's more, the human race isn't a typical mammalian species. There's no ______ why, if we're careful, we should die off after only two million years. The earth will ______ habitable for 500 million years to come. And if someday, we took to the stars, the civilization could continue for billions more.
Solution
- remain
- people
- potential
- billion
- years
- reason
- framework
- pandemic
- derail
- individual
- single
Original Text
Now the third area is the one that I want to focus on the most, and that's the category of existential risks: events like a nuclear war or a global pandemic that could permanently derail civilization or even lead to the extinction of the human race. Let me explain why I think this is such a big priority in terms of this framework.
First, size. How bad would it be if there were a truly existential catastrophe? Well, it would involve the deaths of all seven billion people on this planet and that means you and everyone you know and love. That's just a tragedy of unimaginable size. But then, what's more, it would also mean the curtailment of humanity's future potential, and I believe that humanity's potential is vast. The human race has been around for about 200,000 years, and if she lives as long as a typical mammalian species, she would last for about two million years. If the human race were a single individual, she would be just 10 years old today. And what's more, the human race isn't a typical mammalian species. There's no reason why, if we're careful, we should die off after only two million years. The earth will remain habitable for 500 million years to come. And if someday, we took to the stars, the civilization could continue for billions more.
Frequently Occurring Word Combinations
n-grams of length 2
| collocation | frequency |
| --- | --- |
| human race | 5 |
| effective altruism | 3 |
| million years | 3 |
| vast majority | 2 |
| industrial revolutions | 2 |
| research program | 2 |
| easily solvable | 2 |
| global health | 2 |
| big priority | 2 |
| factory farming | 2 |
| hugely neglected | 2 |
| philanthropic funding | 2 |
| nuclear war | 2 |
| typical mammalian | 2 |
| extreme poverty | 2 |
| artificial intelligence | 2 |
Important Words
- area
- bad
- big
- billion
- billions
- careful
- catastrophe
- category
- civilization
- continue
- curtailment
- deaths
- derail
- die
- earth
- events
- existential
- explain
- extinction
- focus
- framework
- future
- global
- habitable
- human
- individual
- involve
- lead
- lives
- long
- love
- mammalian
- means
- million
- nuclear
- pandemic
- people
- permanently
- planet
- potential
- priority
- race
- reason
- remain
- single
- size
- species
- stars
- terms
- today
- tragedy
- typical
- unimaginable
- vast
- war
- years