DISCRIMINATING SYSTEMS
Gender, Race, and Power in AI

Sarah Myers West, AI Now Institute, New York University
Meredith Whittaker, AI Now Institute, New York University, Google Open Research
Kate Crawford, AI Now Institute, New York University, Microsoft Research

APRIL 2019

Cite as: West, S.M., Whittaker, M. and Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. Retrieved from https://ainowinstitute.org/discriminatingsystems.html.

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

CONTENTS

RESEARCH FINDINGS
RECOMMENDATIONS
INTRODUCTION
WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT
WHO MAKES AI?
    Diversity Statistics in the AI Industry: Knowns and Unknowns
FROM WORKFORCES TO AI SYSTEMS: THE DISCRIMINATION FEEDBACK LOOP
CORPORATE DIVERSITY: BEYOND THE PIPELINE PROBLEM
    Core Themes in Pipeline Research
    Limitations of Pipeline Research
    Pipeline Dreams: After Years of Research, The Picture Worsens
WORKER-LED INITIATIVES
THE PUSHBACK AGAINST DIVERSITY
CONCLUSION

RESEARCH FINDINGS

There is a diversity crisis in the AI sector across gender and race. Recent studies found only 18% of authors at leading AI conferences are women,i and more than 80% of AI professors are men.ii This disparity is extreme in the AI industry:iii women comprise only 15% of AI research staff at Facebook and 10% at Google. There is no public data on trans workers or other gender minorities. For black workers, the picture is even worse. For example, only 2.5% of Google's workforce is black, while Facebook and Microsoft are each at 4%. Given decades of concern and investment to redress this imbalance, the current state of the field is alarming.

The AI sector needs a profound shift in how it addresses the current diversity crisis. The AI industry needs to acknowledge the gravity of its diversity problem, and admit that existing methods have failed to contend with the uneven distribution of power, and the means by which AI can reinforce such inequality. Further, many researchers have shown that bias in AI systems reflects historical patterns of discrimination. These are two manifestations of the same problem, and they must be addressed together.

The overwhelming focus on "women in tech" is too narrow and likely to privilege white women over others. We need to acknowledge how the intersections of race, gender, and other identities and attributes shape people's experiences with AI. The vast majority of AI studies assume gender is binary, and commonly assign people as "male" or "female" based on physical appearance and stereotypical assumptions, erasing all other forms of gender identity.

Fixing the "pipeline" won't fix AI's diversity problems. Despite many decades of "pipeline studies" that assess the flow of diverse job candidates from school to industry, there has been no substantial progress in diversity in the AI industry. The focus on the pipeline has not addressed deeper issues with workplace cultures, power asymmetries, harassment, exclusionary hiring practices, unfair compensation, and tokenization that are causing people to leave or avoid working in the AI sector altogether.

The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation. The histories of race science are a grim reminder that race and gender classification based on appearance is scientifically flawed and easily abused. Systems that use physical appearance as a proxy for character or interior states are deeply suspect, including AI tools that claim to detect sexuality from headshots,iv predict criminality based on facial features,v or assess worker competence via micro-expressions.vi Such systems are replicating patterns of racial and gender bias in ways that can deepen and justify historical inequality. The commercial deployment of these tools is cause for deep concern.

i. Element AI. (2019). Global AI Talent Report 2019. Retrieved from https://jfgagne.ai/talent-2019/.
ii. AI Index 2018. (2018). Artificial Intelligence Index 2018. Retrieved from http://cdn.aiindex.org/2018/AI%20Index%202018%20Annual%20Report.pdf.
iii. Simonite, T. (2018). AI is the future - but where are the women? WIRED. Retrieved from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/.
iv. Wang, Y. and Kosinski, M. (2018). Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images. Journal of Personality and Social Psychology, 114(2): 246-257.

WHICH HUMANS ARE IN THE LOOP? HOW WORKFORCES AND AI SYSTEMS INTERACT

From this perspective, locating individual biases within a given technical system, and attempting to fix them by tweaking the system, becomes an exercise in futility. Only by examining discrimination through the lens of its social logics (who it benefits, who it harms, and how) can we see the workings of these systems in the context of existing power relationships.
In addition to asking when and how AI systems favor some identities over others, we might also ask: what is the logic through which artificial intelligence “sees” and constructs gender and race to begin with? How does it engage in the production and enactment of new classifications and identities?40 And how do AI systems replicate historical hierarchies by rendering people along a continuum of least to most “valuable”? These questions point to the larger problem: it is not just that AI systems need to be fixed when they misrecognize faces or amplify stereotypes. It is that they can perpetuate existing forms of structural inequality even when working as intended.

To tackle these questions, our research traces the ways gender and race surface in AI systems and workforces, and their interrelationship. First, we review what is known and not known about diversity in the field of AI, focusing particularly on how frames devoted to the STEM field pipeline have dominated the discourse. Then, we provide a brief summary of the existing literature on gender and racial bias in technologies, and where this literature could be extended. Finally, we look at how calls for diversity in tech have been ignored or resisted, and how these discriminatory views have permeated many AI systems. We conclude by sharing new research findings that point to ways in which a deeper analysis of gender, race, and power in the field of AI can help to redress inequalities in the industry and in the tools it produces.

WHO MAKES AI?
The current data on the state of gender diversity in the AI field is dire, in both industry and academia. For example, in 2013, the share of women in computing dropped to 26%, below their level in 1960.41 Almost half the women who go into technology eventually leave the field, more than double the percentage of men who depart.42 As noted above, a report produced by the research firm Element AI found that only 18% of authors at the leading 21 conferences in the field are women,43 while the 2018 Artificial Intelligence Index reports 80% of AI professors are men.44 This imbalance is replicated at large tech firms like Facebook and Google, whose websites show even greater imbalances, with women comprising only 15% and 10% of their AI research staff, respectively.45,46 There is no reported data on trans workers or other gender minorities.
The state of racial diversity in AI is even worse. Only 2.5% of Google's full-time workers are black, and 3.6% Latinx, with black workers having the highest attrition rate of all racial categories.47 Facebook isn't much better: the company reported that, with 4% black workers and 5% Hispanic workers in 2018, its diversity is improving.48 Microsoft reflects similar levels to Facebook, with 4% black workers and 6% Latinx workers.49 Machine vision researcher and co-founder of Black in AI, Timnit Gebru, said that when she first attended the preeminent machine learning conference NeurIPS in 2016, she was one of six black people out of 8,500 attendees.50 “We are in a diversity crisis for AI,” Gebru explains. “In addition to having technical conversations, conversations about law, conversations about ethics, we need to have conversations about diversity in AI. This needs to be treated as something that's extremely urgent.”51

Of course, artificial intelligence is a sub-field of computer science, and the broader discipline is experiencing an historic low point for diversity: as of 2015, women made up only 18% of computer science majors in the United States, a decline from a high of 37% in 1984.52 No other professional field has experienced such a sharp decline in the number of women in its ranks.53 At present, women make up 24.4% of the computer science workforce, and receive median salaries that are only 66% of the salaries of their male counterparts. These figures are similarly pronounced when race is taken into account; the proportion of bachelor's degree awards in engineering to black women declined 11% between 2000 and 2015.54 The number of women and people of color decreased at the same time that the tech industry was establishing itself as a nexus of wealth and power. This is even more significant when we recognize that these shocking diversity figures are not reflective of STEM as a whole: in fields outside of computer science and AI, racial and gender diversity has shown a marked improvement.55
“... whatever claim to a right to privacy that we may have is diminished by a state that believes that we must always be watched and seen.”99

Asking this question is particularly important given that practices involved in correcting such biases sometimes lead those developing these technologies (most often large corporations) to conduct invasive data collection on communities that are already marginalized, with the goal of ensuring that they're represented. For example, facial recognition systems often have a challenging time recognizing the faces of people undergoing gender transition. This error has been a problem for trans Uber drivers, because the facial recognition system Uber built in as a security feature has led their accounts to be suspended, preventing them from working while they seek to get their accounts restored.100

These harms should be balanced against remedies that rely on unethical practices, or that propose mass data collection as the solution to bias. One approach that received particular pushback collected videos from transgender YouTubers without their consent in order to train facial recognition software to more accurately recognize people undergoing the process of transitioning.101 In this case, allowing alternate means of account verification may be a better “fix” than continuing to rely on a system whose efficacy demands increased surveillance and worker control.

NOTES

40. Kloppenburg, S. and van der Ploeg, I. (2018). Securing Identities: Biometric Technologies and the Enactment of Human Bodily Differences. Science as Culture.
41. Thompson, C. (2019, Feb. 13). The Secret History of Women in Coding. New York Times Magazine. Retrieved from https://www.nytimes.com/2019/02/13/magazine/women-coding-computer-programming.html?linkId=65692573.
42. Ashcraft, C., McLain, B. and Eger, E. (2016). Women in Tech: The Facts. National Center for Women & Information Technology. Retrieved from https://www.ncwit.org/sites/default/files/resources/womenintech_facts_fullreport_05132016.pdf.
43. Element AI. (2019). Global AI Talent Report 2019. Retrieved from https://jfgagne.ai/talent-2019/.
44. AI Index 2018. (2018). Artificial Intelligence Index 2018. Retrieved from http://cdn.aiindex.org/2018/AI%20Index%202018%20Annual%20Report.pdf.
45. Simonite, T. (2018). AI is the future - but where are the women? WIRED. Retrieved from https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/.
46. The World Economic Forum's 2018 Global Gender Gap Report includes a section on diversity in AI that places its estimate much higher, at 22%. However, the methodology for obtaining this figure raises some questions: it relies on LinkedIn users' inclusion of AI-related skills in their profiles as the primary data source. This requires several causal leaps: first, that a sample of LinkedIn users is representative of the global population of workers in the field of AI, and second, that these users accurately represented their skill set. Moreover, the study used a flawed mechanism to attribute gender to users on a binary basis by inference from their first name, a practice that is not only trans-exclusionary but particularly problematic in an analysis that includes names in non-English languages.
47. Google. (2018). Google Diversity Annual Report 2018. Retrieved from https:/ Google_Diversity_annual_report_2018.pdf.
48. Williams, M. (2018, July 12). Facebook 2018 Diversity Report: Reflecting on Our Journey. Retrieved from https:/ diversity-report/.
49. Microsoft. (2019). Diversity
