Full Transcript

From the TED Talk by Zeynep Tufekci: "Machine intelligence makes human morals more important"


Unscramble the Blue Letters


In Wisconsin, a defndaent was seetcennd to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a cmemiroacl black box. The cpnmoay refused to have its algorithm be challenged in open crout. But ProPublica, an ivvgnestiitae nonprofit, audited that very aiorlhgtm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.

Open Cloze


In Wisconsin, a _________ was _________ to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a __________ black box. The _______ refused to have its algorithm be challenged in open _____. But ProPublica, an _____________ nonprofit, audited that very _________ with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.

Solution


  1. defendant
  2. sentenced
  3. commercial
  4. company
  5. court
  6. investigative
  7. algorithm

Original Text


In Wisconsin, a defendant was sentenced to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions. He wanted to know: How is this score calculated? It's a commercial black box. The company refused to have its algorithm be challenged in open court. But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.
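The passage describes an audit in broad strokes: checking whether the algorithm predicts better than chance, and whether its errors fall more heavily on one group than another. ProPublica's actual analysis used its own compiled dataset and methodology; purely as an illustration of what such an audit measures, the sketch below (in Python, with hypothetical file and column names) computes overall accuracy against a chance baseline and the false positive rate per group.

# Hypothetical sketch of a group-wise audit. The file name and column names
# are illustrative only -- this is NOT ProPublica's actual method or data.
import csv
from collections import defaultdict

def audit(rows):
    """rows: dicts with 'race', 'predicted_high_risk' ('1'/'0'), 'reoffended' ('1'/'0')."""
    correct = 0
    false_pos = defaultdict(int)   # flagged high risk but did not reoffend, per group
    negatives = defaultdict(int)   # did not reoffend at all, per group
    for r in rows:
        pred = r["predicted_high_risk"] == "1"
        actual = r["reoffended"] == "1"
        correct += (pred == actual)
        if not actual:
            negatives[r["race"]] += 1
            if pred:
                false_pos[r["race"]] += 1
    accuracy = correct / len(rows)  # compare against the ~0.5 "chance" baseline
    fpr = {g: false_pos[g] / n for g, n in negatives.items() if n}
    return accuracy, fpr

if __name__ == "__main__":
    with open("scores.csv", newline="") as f:   # hypothetical public data file
        accuracy, fpr = audit(list(csv.DictReader(f)))
    print(f"overall accuracy: {accuracy:.2f}")
    for group, rate in sorted(fpr.items()):
        print(f"false positive rate ({group}): {rate:.2f}")

A large gap between the groups' false positive rates is the kind of disparity the talk refers to when it says black defendants were wrongly labeled as future criminals at twice the rate of white defendants.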

Frequently Occurring Word Combinations


ngrams of length 2

collocation frequency
computational systems 3
machine intelligence 3
human affairs 3
black box 3
computer programmer 2
human faces 2
anchor computation 2
artificial intelligence 2
sounds good 2
human resources 2
resources managers 2
human managers 2
predictive power 2
world war 2
war ii 2

ngrams of length 3

collocation frequency
human resources managers 2
world war ii 2


Important Words


  1. algorithm
  2. algorithms
  3. audited
  4. barely
  5. biased
  6. black
  7. box
  8. calculated
  9. challenged
  10. chance
  11. commercial
  12. company
  13. court
  14. criminals
  15. data
  16. decisions
  17. defendant
  18. defendants
  19. dismal
  20. evading
  21. find
  22. future
  23. increasingly
  24. investigative
  25. labeling
  26. nonprofit
  27. open
  28. outcomes
  29. parole
  30. police
  31. power
  32. predictive
  33. prison
  34. propublica
  35. public
  36. rate
  37. refused
  38. score
  39. sentenced
  40. sentencing
  41. wanted
  42. white
  43. wisconsin
  44. wrongly
  45. years