
NPR 'Automating Inequality': Algorithms In Public Services Often Fail The Most Vulnerable

Date: 2018-02-22 08:28:39


ARI SHAPIRO, HOST:

In the fall of 2008, an Indiana woman named Omega Young got a letter saying she needed to recertify for the state's public benefits program.

VIRGINIA EUBANKS: But she was unable to make the appointment because she was suffering from ovarian cancer.

SHAPIRO: She called the local office to say she wouldn't make the appointment because she was hospitalized getting cancer treatments, and she lost her benefits anyway. The reason - failure to cooperate.

EUBANKS: So because she lost her benefits, she couldn't afford her medications, she lost her food stamps, she couldn't pay her rent. She lost access to free transportation to her medical appointments. And Omega Young died on March 1, 2009. And on the next day, she won an appeal for wrongful termination and all of her benefits were restored the day after her death.

SHAPIRO: This is one of the stories the author Virginia Eubanks tells in her latest book "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor." That book is the subject of this week's All Tech Considered.

(SOUNDBITE OF MUSIC)

SHAPIRO: Virginia Eubanks argues that many of the automated systems that deliver public services today are rigged against the people these programs are supposed to serve. She dives deep into three examples of automated public services - welfare benefits in Indiana, housing for the homeless in Los Angeles and children's services in Allegheny County, Pa., which includes Pittsburgh.

The Indiana case was so bad that the state eventually gave up on the automated system. Virginia Eubanks started by telling me what state lawmakers were trying to accomplish through automation.

EUBANKS: Indiana was attempting to save money and to make the system more efficient. But the way the system rolled out, it seems like one of the intentions was actually to break the relationship between caseworkers and the families they served. The governor sort of did a press tour around this contract. And one of the things he kept bringing up was there was one case where two case workers had colluded with some recipients to defraud the government for about - I think it was about $8,000.

And the governor used this case over and over and over again to suggest that when caseworkers and families have personal relationships, that it's an invitation to fraud. So the system was actually designed to break that relationship. So what happened is the state replaced about 1,500 local caseworkers with online forms and regional call centers.

And that resulted in a million benefits denials in the first three years of the experiment, which was a 54 percent increase from the three years before.

SHAPIRO: Is an automated system of public services inherently going to be less helpful, less effective than something like Uber or Lyft or Amazon or all the automated things that people who are not in poverty rely on every day?

EUBANKS: No. There's nothing intrinsic in automation that makes it bad for the poor. One of my greatest fears in this work is that we're actually using these systems to avoid some of the most pressing moral and political challenges of our time, specifically poverty and racism. So we're kind of using these systems as a kind of empathy override. You know, let's talk about Los Angeles.

So there's 58,000 unhoused folks in Los Angeles. It's the second-highest unhoused population in the United States and 75 percent of them are completely unsheltered, which means they're just living in the street. I do not want to be the case worker who is making that decision, who is saying there's 50,000 people with no resources. I have, you know, a handful of resources available. Now I have to pick.

But the problem is that we are using these tools to basically outsource that incredibly hard decision to machines.

SHAPIRO: So the underlying problem is not that the housing system is automated, but it sure doesn't help that automating that system allows people to ignore, more or less, the fact that there are not enough houses.

EUBANKS: Yeah. So one of the folks I talked to in the book, this great, brilliant man Gary Blasi has one of the best quotes in the book and he says, homelessness is not a systems engineering problem. It's a carpentry problem, right?

SHAPIRO: If you've got 10 houses for 20 people, it doesn't matter how good the system for housing those people is, it's not going to work.

EUBANKS: Exactly.

SHAPIRO: As you point out in the book, caseworkers have biases. There are case workers who are racist, who discriminate, who favor some clients over others for inappropriate reasons. Doesn't automation have the potential to solve those problems?

EUBANKS: Yeah, let's be absolutely direct about this - human bias in public assistance systems has created deep inequalities for decades. And it's specifically around the treatment of black and brown folks, who have often been either overrepresented in the more punitive systems or diverted from the more helpful systems because of frontline caseworker bias.

SHAPIRO: So they get thrown in prison more often or their children taken away more often, they get public housing less often, that sort of thing.

EUBANKS: Exactly. But the thing that's really important to understand about the systems I profile in "Automating Inequality" is that these systems don't actually remove that bias, they simply move it. So in Allegheny County where I look at the predictive model that's supposed to be able to forecast which children will be victims of abuse or neglect in the future, in that case, one of the hidden biases is that it uses proxies instead of actual measures of maltreatment.

And one of the proxies it uses is called call re-referral, which just means that a child is called on and then a second call comes in within two years. And the problem with this is that both anonymous reporters and mandated reporters report black and biracial families for abuse and neglect 3.5 times more often than they report white families.
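
To make the proxy problem she describes concrete, here is a minimal sketch in Python. It is not the Allegheny County model itself, and every number and name in it (TRUE_RATE, REPORT_RATE, the group labels) is hypothetical. It assumes two groups with the same underlying maltreatment rate but a roughly 3.5-to-1 disparity in how often they are reported, and shows that the re-referral proxy then scores the over-reported group as far riskier.

```python
# Hypothetical sketch of the proxy problem described above: two groups with
# the SAME true maltreatment rate, but one is reported about 3.5x as often.
# The proxy label ("a second call within two years") then looks far riskier
# for the over-reported group, so a model trained on it inherits that bias.

import random

random.seed(0)

TRUE_RATE = 0.05                      # identical true maltreatment rate for both groups
REPORT_RATE = {"A": 0.35, "B": 0.10}  # hypothetical 3.5x disparity in being reported
N = 100_000

def proxy_label(group: str) -> int:
    """Return 1 if a re-referral call comes in within two years (the proxy), else 0."""
    at_risk = random.random() < TRUE_RATE
    # A call is somewhat more likely when maltreatment really occurred, but the
    # baseline chance of being called on at all depends on the group's report rate.
    call_prob = min(1.0, REPORT_RATE[group] * (2.0 if at_risk else 1.0))
    return int(random.random() < call_prob)

for group in ("A", "B"):
    rate = sum(proxy_label(group) for _ in range(N)) / N
    print(f"group {group}: proxy 'risk' rate = {rate:.3f} (true maltreatment rate = {TRUE_RATE})")
```

The point of the sketch is that the proxy measures reporting behaviour at least as much as it measures maltreatment, so the disparity in who gets reported flows straight into the predicted "risk."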

SHAPIRO: You draw these three detailed pictures of automated systems falling short in Indiana, California, Pennsylvania. Do you think a different author could have found three different automated systems somewhere in the country that were working really well in providing services effectively?

EUBANKS: Absolutely. One of the things that's different about the way that I wrote the book is that I started from the point of view of the targets of these systems. It doesn't mean I only spoke to those folks. But I spoke to, you know, unhoused folks, both those who have had luck getting housing through coordinated entry and those who haven't. I spoke to families who have been investigated for maltreatment.

And I will say that when you start from the point of view of these very vulnerable families, that these systems look really different than they look from the point of view of the data scientists or administrators who are developing them. And I wasn't hearing these voices at all in the debates that we've been having about what's sort of coming to be known as algorithmic accountability or algorithmic fairness.

I was never hearing the voices of the people who face the pointy end of the most punitive stick. And I really thought it was important to bring those stories to the table.

SHAPIRO: Virginia Eubanks, thanks so much for talking with us.

EUBANKS: Thank you so much.

SHAPIRO: Her book is called "Automating Inequality: How High-Tech Tools Profile, Police, And Punish The Poor."


