
《终结者》中机器人大战:解读科技发达的背后

2015-08-12 11:11

来源:21英语

作者:

  科技日新月异,机器人技术的迅速发展给我们的生活带来了诸多便利。不过,如果机器人成为了《终结者》中的战争机器,人类有能力应对吗?

  It sounds like a science-fiction nightmare. But “killer robots” have the likes of British scientist Stephen Hawking and Apple co-founder Steve Wozniak fretting and warning the machines could fuel ethnic cleansings and an arms race.

  机器人变杀手,听上去很像科幻小说中的情节,但是英国科学家霍金以及苹果联合创始人沃兹尼亚克都对此忧心忡忡。他们警告世人:这样的机器可能会引发种族清洗和军备竞赛。

  Autonomous weapons, which use artificial intelligence to select targets without human intervention, were described as “the third revolution in warfare, after gunpowder and nuclear arms,” by about 1,000 tech bigwigs in an open letter on July 28.

  7月28日,大约1000名科技界的大人物联名签署公开信,信中表示:自主武器可以使用人工智能选择目标,不需要人力介入,这样的技术也被形容为“继火药和核武器之后的第三次战争革命”。

  Unlike drones, which require a human hand in their action, this kind of robot would have some autonomous decision-making abilities and the capacity to act on its own authority.

  无人机还需要人来操控其行动,杀手机器人与其不同的是,他们某种程度上拥有自主决策能力,以及自我行动的能力。

  "The key question for humanity today is whether to start a global AI (artificial intelligence) arms race or to prevent it from starting," they wrote.

  科学家在信中表示:“当今人类面临的关键问题是,究竟要开启一场全球性的人工智能军备竞赛,还是将这样的趋势扼杀在摇篮中。”

  "If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable," said the letter released at the opening of the 2015 International Joint Conference on Artificial Intelligence in Buenos Aires.

  2015年国际人工智能联合会议在布宜诺斯艾利斯举行,这封信在大会开幕式上公布:“如果任何一个军事大国推动人工智能武器的发展,全球军备竞赛将几乎不可避免。”

  The idea of an automated killing machine – made famous by Arnold Schwarzenegger’s Terminator – is moving swiftly from science fiction to reality, according to the scientists.

  自主化杀人机器的理念,从施瓦辛格的《终结者》电影开始为人熟知。而科学家们认为,这一概念正从科幻小说中进入到现实世界。

  "The deployment of such systems is – practically if not legally – feasible within years, not decades," the letter said.

  信中写道:“部署这类系统即使在法律上不可行,在实际操作上也将在数年内成为可能,不需要几十年的时间。”

  Lower bar for entry

  门槛低

  The development of such weapons, while potentially reducing the extent of battlefield casualties, might also lower the threshold for going to battle, noted the scientists.

  科学家表示,这类武器的发展,有可能降低战场伤亡,但同时也可能降低了战争爆发的门槛。

  The scientists painted an apocalyptic scenario in which autonomous weapons fall into the hands of terrorists, dictators or warlords hoping to carry out ethnic cleansings.

  科学家们还描绘了一幅末日图景:自主化武器落入恐怖分子、独裁者手中,或者被妄图实施种族清洗的军阀所利用。

  The group concluded with an appeal for a “ban on offensive autonomous weapons beyond meaningful human control.”

  科学家们最后呼吁“禁止超出人类有效控制范围的攻击性自主武器”。

  In a 2014 BBC interview, Hawking said the development of full artificial intelligence could spell the end of the human race.

  在2014年BBC的采访中,霍金表示全方位人工智能的发展,可能会把人类推向末日。

  "It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” he said.

  他表示:“它会自我启动,并以越来越快的速度重新设计自身。而受限于缓慢生物进化的人类无法与之匹敌,最终会被其取代。”

  Authorities are gradually waking up to the risk of robot wars. Last May, for the first time, the United Nations brought governments together to begin talks on so-called "lethal autonomous weapons systems" that can select targets and carry out attacks without direct human intervention.

  有关部门也逐渐意识到机器人战争的危险性。去年5月,联合国首次召集各国政府,就所谓可在没有直接人力干预的情况下选择目标并实施攻击的“致命自主武器系统”展开讨论。

  In 2012, the US government imposed a 10-year human control requirement on automated weapons.

  2012年,美国政府规定,此后10年内自动化武器必须保持人工控制。

  There have been examples of weapons being stopped in their infancy.

  历史上也有武器在发展初期就被叫停的先例。

  After UN-backed talks, blinding laser weapons were banned in 1998, before they ever hit the battlefield.

  1998年,经过联合国支持下的讨论,激光致盲武器在真正亮相战场之前就被禁止。



(编辑:何莹莹)


