Apple recently announced plans to use a tool designed to identify known child sexual images on iPhones.
The decision was praised by child protection groups. But some privacy activists and security researchers have raised concerns. They warn that the system could be misused to search for other kinds of information or be used by governments to watch citizens.
How does it work?
Apple says the tool, called "NeuralHash," will scan all images kept on the device that are sent to iCloud, the company's online storage system. iPhone users can choose in their settings whether to send photos to iCloud or have them remain on the device. If the images are not sent to iCloud, Apple says they will not be scanned by the new tool.
The system searches for photos included in a database of known child sexual abuse images collected by law enforcement. Apple's scanning system will change the images into a "hash." This is a numerical piece of data that can identify the images but cannot be used to recreate them. This hash will be uploaded and compared against the law enforcement image database.
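The hash-and-compare step described above can be sketched in a few lines of Python. This is an illustrative stand-in only: Apple's NeuralHash is a perceptual hash designed to match an image even after resizing or re-encoding, while the cryptographic SHA-256 digest used here matches only byte-identical files, and the database entry is invented sample data.

```python
import hashlib

# Hypothetical database of known-bad image hashes (hex digests).
# In Apple's system this comes from law enforcement; here it is sample data.
known_database = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size numerical digest that can
    identify it but cannot be reversed to recreate the picture."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Compare the image's hash against the known-image database."""
    return hash_image(image_bytes) in known_database

print(matches_database(b"example-known-image-bytes"))  # True
print(matches_database(b"an-unrelated-photo"))         # False
```

Note that only the digest, never the image itself, needs to be uploaded for the comparison, which is the privacy property the article describes.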
If the system matches an image with one in the database, it will be examined by a human. If the person confirms the image as a match, the device user's account will be locked and the National Center for Missing and Exploited Children (NCMEC) will be contacted.
The system is designed to only identify images already included in the existing database. Apple says parents taking innocent photos of unclothed children need not worry about such images being identified.
Concerns about possible abuse
Some security researchers have criticized the way NeuralHash "sees" the images and say the system could be used for dangerous purposes.
Matthew Green is a top cryptography researcher at Johns Hopkins University. He told the Associated Press that he fears the system could be used to frame innocent people. Bad actors could send users images that seem harmless but that the system would report as child sexual abuse material. Green said researchers have been able to easily fool similar systems in the past.
Another possible abuse could be a government seeking to watch dissidents or protesters. "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."
In an online explanation of its system, Apple said it "will refuse any such (government) demands."
Apple has been under pressure from governments and law enforcement to permit increased observation of data that it encrypts on its devices. The company said its new tool was designed to operate "with user privacy in mind." It also claimed the system was built to reduce the chance of misidentification to one in one trillion each year.
However, some privacy researchers said the system represents a clear change for a company that has been praised for its leadership on privacy and security.
In a joint statement, India McKinney and Erica Portnoy of the Electronic Frontier Foundation warned that Apple's new tool "opens a backdoor to your private life." The two noted that it may be impossible for outside researchers to confirm whether Apple is operating the system as promised.
Apple's system was also criticized by former U.S. National Security Agency contractor Edward Snowden. Snowden lives in exile because he is wanted in the U.S. on spying charges linked to his release of information on secret government programs for gathering intelligence.
He tweeted that with the new tool, Apple was offering "mass surveillance to the entire world." Snowden added: "Make no mistake, if they can scan for kiddie porn today, they can scan for anything tomorrow."
Separately, Apple announced it was adding new tools to warn children and parents when sexually explicit images are received or sent. This system is designed to identify and blur such images and warn children and parents about the content. Apple says the tool will only work for messages in child accounts registered in the company's Family Sharing system.
Apple said the changes will come out later this year with new releases of its device operating systems.
Words in This Story
scan – v. to look at (something) carefully usually in order to find someone or something
match – n. a person or thing that is equal to another
cryptography – n. the use of special codes to keep information safe in computer networks
encrypt – v. to change (information) from one form to another especially to hide its meaning
surveillance – n. the act of carefully watching activities of people especially in order to control crime or the spread of disease
porn (pornography) – n. movies, pictures, magazines, etc., that show or describe naked people or sex in an open and direct way in order to cause sexual excitement
explicit – adj. showing or talking about sex or violence in a very detailed way
blur – v. to make (something) unclear or difficult to see or remember