
Can We Prevent the End of the World?


Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)
And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us. And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light. We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.
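The fire-insurance comparison is, in effect, an expected-value argument. As a rough sketch (the probability $p$ and loss $L$ below are assumed for illustration, not figures from the talk), a premium can be worth paying even when the insured event is very unlikely:

\[
\mathrm{E}[\text{loss}] = p \cdot L = 10^{-3} \times 300{,}000 = 300 \ \text{per year},
\]

so an annual premium of a few hundred is reasonable against a house fire with a one-in-a-thousand yearly chance. For a catastrophic global scenario, $L$ is vastly larger, so even a tiny $p$ can justify a substantial safeguard.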
And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed. For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own and threatens us all.
Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.
So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life? When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.
And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale.
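To make the comparison concrete in the talk's own terms, let $N$ stand for the present human population and $F$ for the potential future lives that post-human evolution could bring (both symbols are introduced here purely for illustration). Scenario A costs roughly $0.9N$ lives and leaves the future open; Scenario B costs $N + F$. If the future could run for millions of centuries, then $F \gg N$ and

\[
\frac{N + F}{0.9\,N} \;\gg\; \frac{1}{0.9} \approx 1.1,
\]

which is why B is incomparably worse rather than merely about 10 percent worse.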
So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential. Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.
And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."
Thank you very much.

