STEVE INSKEEP, HOST:
Sheryl Sandberg, one of the top executives of Facebook, says she had to come to grips with the reality - a building full of social media users in Moscow linked to the Russian government spread election disinformation in the United States using Facebook.
SHERYL SANDBERG: In 2016, the Russian Internet Research Agency interfered in the election on our platform, and that was something we should have caught, we should have known about. We didn't. Now we've learned.
INSKEEP: That learning is what Sandberg wants to emphasize now. Over the past year, Facebook seemed to downplay fake news on the platform. Then it had to acknowledge Russian trolls made significant use of Facebook. Next week, the company faces congressional hearings about sharing its users' data and more. And, as we met Sheryl Sandberg in the Facebook headquarters building, the 2018 elections loomed. Facebook said for years it was not a publisher, just a platform not entirely responsible for what billions of people post there, no matter how deceptive it may be. Disasters of recent years have forced the company to shift its approach somewhat. And when we referred to the company as a publisher, Sheryl Sandberg did not question it.
What do you think your company's role is as a publisher in this year's election and in the presidential election that's coming in a few years?
SANDBERG: Well, we certainly know that people want accurate information, not false news on Facebook. And we take that really seriously, and we just want to make sure that there's no foreign interference. We are also really taking very aggressive steps on ads transparency.
INSKEEP: The company says it will disclose who pays for political ads on Facebook.
SANDBERG: We're also building an archive of political ads that will run forward and build for four years so you'll always have, once it builds up, four years of data where, for any political ad, you'll be able to see who ran it, who paid for it, how much they spent and the demographics of who saw it. Again, industry-leading transparency.
INSKEEP: Because it's clear to you that in 2016 it's hard for anybody to know. Or, it was hard at the time for anybody to know just how money was being spent and by whom.
SANDBERG: Well, this hasn't happened in our industry. And that's why, again, we're not waiting for the regulation to happen to do this. We're doing it because we think that transparency is really important.
INSKEEP: Since the 2016 election, Facebook has taken steps to deemphasize news shared by media companies. Articles shared by your friends get more prominence. Now it plans more steps. News organizations widely rated as credible will get more play while those deemed not so credible will get less. Outside fact-checkers will help to examine articles, and users will be warned when they try to share doubtful ones.
Are you comfortable being the censor, which is effectively what you would have to be, wouldn't it?
SANDBERG: We're trying to have very good community standards. We're open about what those community standards are all around the world, and we're going to get increasingly open about this. We want to make sure people understand, you know, there's no place for terrorism. There's no place for hate. There's no place for bullying. We don't sell your data, ever. We don't give your information to advertisers. You're not allowed to put, you know, hate content on our site. With news, we rely on third parties. We don't believe we can be the world's fact-checkers, but that doesn't mean we don't have a big responsibility.
INSKEEP: A company that aspired to connect the world has begun to face demands that it occasionally break the connection.
You probably know that there was a leaked memo from 2016 from a Facebook executive who said we care so much about connecting people that even if we connected people who used our platforms to coordinate a terrorist attack, we're fine with that because we're still just connecting people.
SANDBERG: Right. So...
INSKEEP: That was 2016. Do you still believe that?
SANDBERG: We never believed that. The person who wrote it, named Boz, never believed it. He's a provocative guy and was trying to spark debate. But Mark never believed it. I never believed it. So terrorism...
INSKEEP: So maybe it was hyperbole. But he was leaning in the way that he did believe, that maybe you cared too much about this...
SANDBERG: Well, let's go to the example.
INSKEEP: ...Too little about other things.
SANDBERG: Let's go to the example.
INSKEEP: Sure.
SANDBERG: There's no place for terrorism on our platform. We've worked really hard on this. Ninety-nine percent of the ISIS content we're able to take down now, we find before it's even posted. We've worked very closely with law enforcement all across the world to make sure there is no terrorism content on our site, and that's something we care about very deeply.
INSKEEP: But what about the broader point? Essentially he was saying the company's values are out of whack - we're interested in one really big important thing, perhaps to the exclusion of other things.
SANDBERG: Again, that memo is wrong, and he said he didn't mean it. And Mark and I certainly never agreed. We never only cared about one thing. We cared about social sharing, and we cared about privacy. That's why we put the controls in place. I think the balance was off because we didn't foresee as many bad use cases, and that balance has shifted and shifted hard now.
INSKEEP: That's part of our talk with Sheryl Sandberg of Facebook. She's talking to people like us in part to prepare the ground for an event next week. Her boss, Mark Zuckerberg, takes questions before Congress. And NPR congressional reporter Kelsey Snell is with us. Hey there, Kelsey.
KELSEY SNELL, BYLINE: Hi there.
INSKEEP: What do lawmakers want to know?
SNELL: Well, they want to know a lot because they have been asking for Mark Zuckerberg to come and testify before Congress for a long time - for years, in fact. And he has put that off, and they have sent other people, other representatives from Facebook. But there will be a lot of pent-up energy and a lot of pent-up questions for Zuckerberg, not just about Cambridge Analytica and the security situation, but about Facebook's role and social media's role in data security and the way people's information is shared.
INSKEEP: But let me ask what the point is, Kelsey, because when we were talking with Sheryl Sandberg, one of the things she said in the full interview is there's not really very much regulatory activity going on in Congress. There's only one piece of legislation that she even knew about that seemed mildly significant. Are lawmakers actually considering anything that would in any way rein in or regulate Facebook?
SNELL: Even some Democrats, who are more open to the idea of regulation, say that it would be hard in this environment to pass any new legislation that regulates Facebook or other social media sites. But I think it's interesting what we heard her say there about voluntary transparency. That is a way to stave off any inklings of regulation that might be brewing in Congress, and it kind of sets up a situation where Congress may not want to crack down now. But these things take time. Hearings traditionally are the start, not the end, of something in Congress. So it's kind of this moment where Congress is acknowledging a national conversation, stepping in, saying that they're paying attention. But we may not see them actually respond with legislation or with any real action for some time.
INSKEEP: How significant is Facebook's promise to be more transparent about who is paying for political ads? I mean, I'm asking you as a political reporter, was it hard to tell who was spending money, how, in the 2016 election?
SNELL: Yeah. And this new transparency, it will give new information, but it's hard to know just from what she's saying right now how that information will be accessed, how deep the information will go. Right now we as reporters have access to a fairly in-depth research opportunity to kind of go through political ads that are, you know, that exist now. And we need to know what this will look like from them.
INSKEEP: Kelsey, thanks.
SNELL: Thank you.
INSKEEP: That's NPR's Kelsey Snell.