STEVE INSKEEP, HOST:
Facebook says it will be more open about the posts it takes down. The company tells NPR that today it is publishing internal details of its community standards. That's the term for what's allowed on Facebook and what is not. Monika Bickert is a Facebook vice president.
MONIKA BICKERT: So we've always had a set of community standards that the public can see that explain, for instance, no harassment, no bullying, no terror propaganda. But now we're actually explaining how we define those terms for our review teams and how we enforce those policies.
INSKEEP: She says users want more openness, which is an understatement. The company is under unprecedented pressure. It's been roiled by two years of questions - which news did it promote during the last election? How widely did it share users' data? - and more. Now it is revealing definitions used by internal monitors who check up on complaints about posts around the world, like, what exactly constitutes a genuine death threat? If it names a person, location or weapon, that should come down. Or what exactly amounts to hate speech?
BICKERT: Where we have drawn the line is that we will allow attacks or negative commentary about institutions or countries or religions, but we don't allow attacks against people. So if somebody is criticizing or attacking all members of a religion, that's where we would draw the line.
INSKEEP: I wonder if one of the gray areas there might be someone who criticizes Islam but in an extreme way that somebody might argue is inciting people against Muslims.
BICKERT: We do try to allow as much speech as possible about institutions, religions, countries, and we know sometimes that might make people uncomfortable. That's one of the reasons we give people a lot of choice and control over what they see on Facebook. You can unfollow pages, you can unfollow people, and you can block people that you don't want to communicate with.
INSKEEP: How are you thinking about the environment as the 2018 election approaches and, of course, there will once again be lots of political speech on Facebook?
BICKERT: Well, we know there are a lot of very serious issues, and it's important to get them right. We're focused on combating fake news. We're also focused on providing increased transparency into political advertisements and pages that have political content. And we're also investing a lot in our technical tools that help keep inauthentic accounts off the site.
INSKEEP: Are you already going after fake accounts in that larger, more specific way in the United States here in 2018?
BICKERT: Yes. The tools that we have developed to more effectively catch fake accounts - they've improved a lot, and we are using them globally. We now are able to stop more than a million fake accounts at the time of creation every day.
INSKEEP: The publication of its internal standards is another signal that Facebook is having to acknowledge that it is effectively a publisher. It would prefer to define itself as a technology company, just a platform for other people's speech, but the founder, Mark Zuckerberg, now accepts some responsibility for what is posted. Facebook was embarrassed when a famous old Vietnam War photo was mistakenly censored and then put back up. It's also had to tussle with authoritarian governments like Russia and Turkey that demand some posts be taken down. Just last weekend, Sri Lankan officials complained to The New York Times that Facebook was not responsive enough to complaints of hate speech. Monika Bickert says that when pressured by governments, the company at least tries to keep up speech that meets its standards.
What does this announcement suggest about the power your company has?
BICKERT: I think what it suggests is that we really want to respond to what the community wants. What we're hearing is that they want more clarity, and they want to know how we enforce these rules. That's why we're doing this. And we're actually hopeful that this is going to spark a conversation.
INSKEEP: But this is also a reminder, you've got this enormous fire hose of speech, maybe the world's largest fire hose of speech, and you can turn that fire hose on or off. It's your choice.
BICKERT: I want to be very clear that when we make these policies, we don't do it in a vacuum. This is not my team sitting in a room in California saying, these will be the policies. Every time we adjust a policy, we have external input from experts around the world.
INSKEEP: The company that claims some 2 billion users around the world insists it is straining to work within the laws of every country while still allowing as much speech as it can.