'Racist' Technology Is a Bug—Not a Crime


IDEAS | John McWhorter is an associate professor of English and comparative literature at Columbia University.

We are told of late that we must entertain whether technology can be racist. Like when Google Photos, in 2015, algorithmically identified black people as gorillas. Or earlier this year, when Microsoft's Twitterbot Tay, designed to emulate human conversation by trawling tweets, sucked up racist nonsense along with everything else and started spouting some of its own. Or when, in August, Snapchat offered a selfie-altering filter that rendered users as an offensive Asian caricature.

Of course these things should not, once noticed, stay as they are. (All of the examples above were altered or taken down.) But are they racist, i.e., evidence that contempt for racial minorities is still the warp and woof of our society? The fact that we are trained to approach such things from that perspective calls for some words from James Baldwin, someone few consider to have ever gotten much wrong on race. Here he is in 1962: "I do not know many Negroes who are eager to be accepted by white people, still less to be loved by them; they, the blacks, simply don't wish to be beaten over the head by the whites every instant of our brief passage on this planet."

To Baldwin, the issue was getting rid of segregation and police brutality, not cleansing whites' hearts of all racist sentiment, which blacks of his generation considered beside the point, not to mention impossible. I suspect many today concur (Baldwin's quote is very Black Lives Matter), but too often we get our heads turned in unproductive directions. This leads to an obsession not with racism as an obstacle to achievement, but with racism as a social stain to rub out; it's like trying to shame people who don't recycle or floss. Machines cannot, themselves, be racists. Even equipped with artificial intelligence, they have neither brains nor intention.
The question worth asking is whether the people who created a given technology qualify as racists. We can dismiss the idea that the wonks dreaming up these mechanisms deliberately intend to offend people. No one at Google giggled while intentionally programming its software to mislabel black people. Microsoft's engineers were horrified by their Frankenstein Twitterbot. All three of these flubs were just that: unintentional outcomes that their creators were quick to regret and correct.

For example, should we expect those creators to have anticipated that software that codes gorillas as black in color, and perhaps as having fullish lips, might apply the same label to black people? They may well have assumed that the recognition software was programmed richly enough to recognize specifically human traits, and that they therefore need not worry about this. To instead take the occasion to flay Silicon Valley for not hiring enough black people is hasty: can we really be certain that a design team with more black programmers would not have made the same flub? Tay's programmers, meanwhile, would hardly be alone in underestimating the degree of vicious idiocy on Twitter, and may have assumed that Tay's being asked normal, neutral questions would not have sparked links to noxious vitriol. These were, in a word, bugs. Bugs in programs involved chiefly in labeling and language are bound, at some point, to create offense.

The Snapchat filter stands out. Clearly somebody in creative there is on the clueless side (and Snapchat is one of the tech companies that refuses to report on the racial composition of its staff). However, cluelessness is not bigotry. Eyes like the ones used by Snapchat are legion in anime-derived emojis, for example. A sane person could well assume, although perhaps in haste, that such a facelet was within the bounds of decency.

Our culture has gotten to the point that we consider it our job to say these mistakes indicate racism, with the implication that the designers and the people who hire them are therefore racists. This disproportionate disgust is a touch medieval in two ways. Imputing bigotry to a computer program is like imputing a spirit to a tree. And calling a Silicon Valley computer programmer a racist is like deeming one's innocent next-door neighbor a witch for being less than perfect. In a healthier moment there would be more room for saying that these people simply made a mistake.

This appears in the September 12, 2016 issue of TIME. Contact us at editors@time.com. TIME Ideas hosts the world's leading voices, providing commentary on events in news, society, and culture. We welcome outside contributions. Opinions expressed do not necessarily reflect the views of TIME editors.