MICHEL MARTIN, HOST:
When Apple announced the new iPhone can use facial recognition technology to unlock the device, the response may not have been what Apple had hoped for. The feature immediately raised privacy and security concerns. To hear more about that, we're joined now by Clare Garvie. She's an associate at the Center on Privacy and Technology at Georgetown Law Center and co-author of "The Perpetual Line Up: Unregulated Police Face Recognition In America." She's with us now in our studios in Washington, D.C. Clare Garvie, thanks so much for joining us.
CLARE GARVIE: Thank you for having me on.
MARTIN: So lay out the privacy and security concerns for us. It sounds - I mean, the technology, first, if you think about it, sounds really cool. So what's the concern?
GARVIE: That's right. The technology is both convenient and it's really cool. And, frankly, I don't see too many privacy and security concerns with the way Apple has chosen to deploy face recognition. What I'm far more concerned about is as face recognition becomes normalized, as it becomes something that we use on an hour-to-hour basis to send an animated emoji, to check the weather, to send a text, what's going to happen is we get very comfortable with it. And we forget that it's used by any number of actors in ways we may not know about that is both less accurate and more privacy-concerning than the way that Apple has chosen to use it.
MARTIN: Well, give us the worst-case scenario. Give us some scenarios that would cause concern.
GARVIE: So right now, happening in Russia, face recognition has been used to scan anti-government or anti-corruption protests, identify and then publicly name the people at those anti-government protests. What this means is these people will be subject to intimidation, if not arrest, for their political beliefs. Now, before someone says, well, wait, that's Russia. Why should we in the U.S. care about that?
The fact remains in the U.S., it's very much a rules-free zone when it comes to face recognition. Law enforcement across the country use this technology in various ways without any laws governing its use. Evidence suggests that it was used on protesters after the death of Freddie Gray in police custody. It looks like face recognition was used on social media posts that protesters were posting from demonstration sites.
So the law enforcement agents on the ground could, in almost real time, get the identities, the names of the people at those protests. We're a country where we do not necessarily need to show our papers every time we walk down the street. If law enforcement demands our identity, we don't necessarily need to give it. And yet, our faces - now, something we have to present in public - have now done that work for us.
MARTIN: Sounds to me that your concern isn't so much this particular technology but that - what? - that it opens the door to a broader use? Is that really Apple's fault or responsibility?
GARVIE: I don't believe it's Apple's fault. And I think Apple has thought very, very carefully about a number of the security concerns. They have chosen to store the face template, if you will, locally on the phone, which means that it's a lot more secure against being hacked and being stolen. The real concern is that, as face recognition becomes normalized, we may stop worrying about the very real concerns that we should be worrying about as we increasingly are subjected to face recognition that we can't opt out of.
MARTIN: That's Clare Garvie. She is an associate at the Center on Privacy and Technology at Georgetown University's Law Center. She was kind enough to join us at our studios in Washington, D.C. Clare Garvie, thanks so much for speaking with us.
GARVIE: Thanks for having me on.