New technology is making it possible to create realistic videos of people appearing to say things they never actually said.
The videos are known as “deepfakes.”
The name comes from the process of deep learning, a form of artificial intelligence, or AI. They can be created with computer programs available for sale or on the internet. The technology uses face mapping and AI to produce false videos that look almost real.
Lawmakers, intelligence officials and media experts have expressed concern about deepfakes. They warn the false videos could be used to threaten America’s national security or interfere in elections.
The videos are created by loading a complex set of instructions into a computer, along with lots of images and audio recordings. The computer program then learns how to copy the person’s facial expressions, movements, voice and speaking patterns.
Technical experts say, with enough video and audio of a person, the system can produce fake video of the person saying anything.
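The process the experts describe matches the shared-encoder, two-decoder autoencoder design used by early open-source face-swap tools. The PyTorch sketch below is a minimal, illustrative version of that idea under assumed layer sizes and names, not the code of any real product: one encoder learns features common to both faces, each decoder learns to rebuild one person's face, and swapping decoders at the end produces the fake.

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap idea.
# All layer sizes and names are illustrative assumptions; real tools add
# face alignment, much larger networks, and adversarial training.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_decoder():
    # Upsamples a 128-channel latent map back to a 64x64 RGB face.
    return nn.Sequential(
        nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
    )

# Shared encoder: learns pose and expression features common to both people.
encoder = nn.Sequential(
    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
)
decoder_a, decoder_b = make_decoder(), make_decoder()  # one per identity

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

# Stand-ins for batches of cropped, aligned face images of each person.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    # Each decoder learns to reconstruct its own person from shared features.
    loss = F.l1_loss(decoder_a(encoder(faces_a)), faces_a) \
         + F.l1_loss(decoder_b(encoder(faces_b)), faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode person A's expression, decode it as person B's face.
fake_b = decoder_b(encoder(faces_a))
```

The key design choice is that the encoder is shared. Because it must serve both decoders, it is pushed to capture what the two sets of faces have in common, such as pose, lighting and expression, rather than who the person is.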
Florida Senator Marco Rubio is one of several U.S. lawmakers warning of the dangers of such technology. He says that, so far, deepfakes have mainly been used against famous people or to create humorous videos. But he says he can imagine foreign nations finding ways to use them to harm American democracy and society.
Rubio told the Associated Press that a foreign intelligence service could use the technology. Such an actor could produce a fake video of a politician using racist language or doing something illegal. Or they could produce fake video of a U.S. soldier killing civilians overseas or a foreign leader threatening nuclear war.
“It’s a weapon that could be used - timed appropriately and placed appropriately - in the same way fake news is used,” Rubio said. But in video form, such news could create distrust and chaos before an election or any other major U.S. decision, he added.
“We know there are people out there that are trying to divide society, influence elections, and we know this capacity exists. So it's only logical that at some point someone's going to take the next step and sort of weaponize it.”
The issue got attention earlier this year when the website BuzzFeed published a deepfake political video. The false video appeared to show former President Barack Obama giving an address that criticized President Donald Trump.
In the video, the fake Obama says: “You see, I would never say these things, at least not in a public address. But, someone else would.”
It was created with a combination of professional and free video editing programs that use machine learning.
Hany Farid is a digital forensics expert at Dartmouth College in Hanover, New Hampshire. He agrees there is a great possibility that deepfakes will be used to try to influence America’s politics. “I expect that here in the United States we will start to see this content in the upcoming midterms and national election, two years from now.”
The problem, Farid says, is that it will be very easy for almost anybody to create a realistic-looking fake video of world leaders. “We have entered a new world where it is going to be difficult to know how to believe what we see,” he told the Associated Press.
He added that the opposite result is also worrying. People will become so used to seeing false videos that they will be more likely to doubt a real video. Farid expects the problem to spread worldwide.
The U.S. Defense Advanced Research Projects Agency (DARPA) is already working to develop technologies to identify fake images and videos. But Senator Rubio says, currently, the identification process is complex and takes a very long time.
“It takes some real forensic capability, technical capabilities, to be able to show that it's not real. And by the time that's done, it's been widely disseminated.”
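One way to see why Rubio calls detection hard is that forensic checks usually rest on statistical traces rather than anything a viewer can see. One simple heuristic researchers have explored (this is not DARPA's method, which the article does not describe) is that synthesized frames often carry unusual amounts of high-frequency energy. The NumPy sketch below scores a grayscale frame by the share of its spectral energy outside a low-frequency band; the function name, band size and threshold idea are all illustrative assumptions.

```python
# Toy forensic heuristic, for illustration only: synthesized frames often
# show abnormal high-frequency energy in their Fourier spectrum. Real
# detection systems are far more sophisticated than this.
import numpy as np

def high_freq_ratio(frame: np.ndarray) -> float:
    """Fraction of spectral energy outside a central low-frequency band.

    frame: 2-D grayscale image as a float array.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame))) ** 2
    h, w = spectrum.shape
    ch, cw = h // 2, w // 2
    # Sum the energy in a central low-frequency square (1/4 of each axis).
    low = spectrum[ch - h // 8: ch + h // 8, cw - w // 8: cw + w // 8].sum()
    return float(1.0 - low / spectrum.sum())

# Example: a smooth gradient frame versus the same frame with added noise.
rng = np.random.default_rng(0)
smooth = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
noisy = smooth + 0.3 * rng.standard_normal((64, 64))
print(high_freq_ratio(smooth), high_freq_ratio(noisy))  # noisy scores higher
```

A detector built this way would flag frames whose ratio exceeds a threshold learned from known-real footage. Choosing that threshold reliably, across cameras, compression levels and lighting, is part of what makes the forensic work slow.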
Rubio and other lawmakers say people will need to take more responsibility to identify fakes.
I’m Bryan Lynn.
Words in This Story
artificial intelligence – n. ability of a machine to reproduce human behavior
fake – adj. not real, false
appropriately – adv. in a way that is suitable or right for a particular situation or person
chaos – n. a situation where there is no order at all and everyone is confused
logical – adj. using reasoning
forensics – n. scientific methods for examining objects or substances related to a crime
disseminate – v. to spread or give out news, information, ideas, etc. to many people