
Privacy


I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years.

 
You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history.
 
Nowadays, Adam and Eve would probably act differently.
 
[@Adam Last nite was a blast! loved dat apple LOL]
 
[@Eve yep.. babe, know what happened to my pants tho?]
 
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.
 
We start with an observation which, in my mind, has become clearer and clearer in the past few years, that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, on Facebook alone, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude. What happens when you combine these technologies: increasing availability of facial data; improving facial recognition by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do there hundreds of thousands of face metrics in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.
 
To test that, we did an experiment on the Carnegie Mellon University campus. We asked students who were walking by to participate in a study: we took a shot of them with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page of the survey, the page had been dynamically updated with the 10 best-matching photos which the recognizer had found, and we asked each subject to indicate whether he or she appeared in the photos.
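The core of that matching step can be sketched in a few lines of code. The snippet below is a minimal illustration using the open-source face_recognition library rather than whatever recognizer the study actually used; it compares one webcam shot against a folder of previously downloaded profile photos and returns the ten closest faces. The file paths and the .jpg-only gallery are placeholder assumptions.

```python
from pathlib import Path
import face_recognition  # open-source wrapper around dlib's face embeddings

def top_matches(probe_path, gallery_dir, k=10):
    """Return the k gallery photos whose faces are closest to the probe shot."""
    probe_image = face_recognition.load_image_file(probe_path)
    probe_encodings = face_recognition.face_encodings(probe_image)
    if not probe_encodings:
        return []  # no face detected in the webcam shot
    probe = probe_encodings[0]

    names, encodings = [], []
    for photo in sorted(Path(gallery_dir).glob("*.jpg")):
        for encoding in face_recognition.face_encodings(
                face_recognition.load_image_file(photo)):
            names.append(photo.name)
            encodings.append(encoding)

    # Distance in the 128-dimensional embedding space; smaller means more similar.
    distances = face_recognition.face_distance(encodings, probe)
    return sorted(zip(distances, names))[:k]

if __name__ == "__main__":
    for distance, name in top_matches("webcam_shot.jpg", "profile_photos"):
        print(f"{name}: distance {distance:.3f}")
```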
 
Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects.
 
So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data. But a few years back, we did something else. We started from social media data, we combined it statistically with U.S. government Social Security data, and we ended up predicting Social Security numbers, which in the United States are extremely sensitive information.
 
Do you see where I'm going with this? So if you combine the two studies together, then the question becomes, can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available information, much more sensitive information which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse. [27% of subjects' first 5 SSN digits identified (with 4 attempts)] But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject and then upload it to a cloud and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending it back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.
 
In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?
 
We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge our subject harshly? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.
 
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide. Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who were, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.
 
Now, marketers would like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that always be the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walking through a mall while holographic personalized advertising appeared around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.
 
So as an example, this is another experiment we are actually running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now, studies prior to ours have shown that people no longer recognize even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting that you buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
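As a toy illustration of how such a composite could be produced, the sketch below simply averages two photos pixel by pixel with Pillow. Real composite systems align facial landmarks and morph geometry before blending, so this is only a crude stand-in; the file names are placeholders.

```python
from PIL import Image  # Pillow

def naive_composite(path_a, path_b, out_path="composite.jpg", size=(256, 256)):
    """Blend two face photos 50/50 into a single 'composite' image."""
    face_a = Image.open(path_a).convert("RGB").resize(size)
    face_b = Image.open(path_b).convert("RGB").resize(size)
    # alpha=0.5 weights both faces equally in the blend.
    Image.blend(face_a, face_b, alpha=0.5).save(out_path)
    return out_path

naive_composite("friend_one.jpg", "friend_two.jpg")
```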
 
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.
 
So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one. [Have you ever cheated in an exam?] Now to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time we actually started asking the questions.
 
How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cared about faculty reading their answers.
 
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself. When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transaction to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can even have privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy. Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.
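To make "privacy-preserving data mining" concrete, here is one toy instance of a well-known technique, differential privacy: a counting query answered with Laplace noise scaled to the query's sensitivity. It illustrates the general idea only, not the specific technologies the speaker has in mind; the epsilon value and the synthetic student data are made up.

```python
import random

def dp_count(records, predicate, epsilon=0.5):
    """Answer a counting query with epsilon-differential privacy."""
    true_count = sum(1 for record in records if predicate(record))
    # Adding or removing one person changes a count by at most 1, so the
    # sensitivity is 1 and the Laplace noise scale is 1 / epsilon.
    # A Laplace(0, 1/epsilon) sample is the difference of two exponentials.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Synthetic example: how many students admit to having cheated in an exam?
students = [{"cheated": random.random() < 0.2} for _ in range(500)]
print(round(dp_count(students, lambda s: s["cheated"])))
```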
 
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.
 
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.
 
Now there was one English author who anticipated this kind of future where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep. So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will covertly manipulate us.
 
Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.
 
Thank you.
