Full Transcript

From the TED Talk by Sasha Luccioni: AI is dangerous, but not for the reasons you think


Unscramble the Blue Letters


And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tolos for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.

And as these mledos are being doeepyld, are being wveon into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our eiemocons have AI in them. And it's really important that AI stays accessible so that we know both how it wkros and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to mruease AI's impact, we can start getting an idea of how bad they are and start adnersisdg them as we go. Start creating guardrails to protect society and the planet. And once we have this information, cimenaops can use it in order to say, OK, we're going to choose this model because it's more slaaistunbe, this model because it rtesceps copyright. Legislators, who really need information to write laws, can use these tools to doevlep new regulation mechanisms or gevocnnare for AI as it gets deployed into society. And users like you and me can use this iiomaontrfn to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
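
The scrambled words above all follow the same pattern: the first and last letters stay in place and only the interior letters are shuffled ('mledos' for 'models', 'doeepyld' for 'deployed'). Below is a minimal Python sketch of that kind of generator; the function name and retry logic are illustrative, not the exercise site's actual code:

    import random

    def scramble(word):
        """Shuffle a word's interior letters, keeping the first and
        last letters fixed (e.g. 'models' -> 'mledos')."""
        if len(word) <= 3 or len(set(word[1:-1])) <= 1:
            return word  # nothing distinct to shuffle
        while True:
            interior = list(word[1:-1])
            random.shuffle(interior)
            scrambled = word[0] + "".join(interior) + word[-1]
            if scrambled != word:  # retry if the shuffle changed nothing
                return scrambled

    for w in ["tools", "models", "deployed", "economies"]:
        print(w, "->", scramble(w))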

Open Cloze


And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make _____ for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.

And as these ______ are being ________, are being _____ into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our _________ have AI in them. And it's really important that AI stays accessible so that we know both how it _____ and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to _______ AI's impact, we can start getting an idea of how bad they are and start __________ them as we go. Start creating guardrails to protect society and the planet. And once we have this information, _________ can use it in order to say, OK, we're going to choose this model because it's more ___________, this model because it ________ copyright. Legislators, who really need information to write laws, can use these tools to _______ new regulation mechanisms or __________ for AI as it gets deployed into society. And users like you and me can use this ___________ to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
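
Each blank in the cloze is a run of underscores whose length matches the hidden word ('______' for 'models', '________' for 'deployed'). Assuming that convention, a small sketch of a cloze generator could look like this; note that the real exercise blanks hand-picked occurrences (the final 'information' but not the earlier two), which this simplified version approximates by blanking only the first match:

    import re

    def make_cloze(text, targets):
        """Replace each target word with a same-length run of underscores."""
        for word in targets:
            blank = "_" * len(word)
            # \b restricts matching to whole words, so blanking 'works'
            # leaves 'work' intact. count=1 blanks only the first hit;
            # the exercise above picks specific occurrences by hand.
            text = re.sub(rf"\b{re.escape(word)}\b", blank, text, count=1)
        return text

    print(make_cloze("as these models are being deployed, are being woven",
                     ["models", "deployed", "woven"]))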

Solution


  1. tools
  2. models
  3. deployed
  4. woven
  5. economies
  6. works
  7. measure
  8. addressing
  9. companies
  10. sustainable
  11. respects
  12. develop
  13. governance
  14. information

Original Text


And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tools for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.

And as these models are being deployed, are being woven into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our economies have AI in them. And it's really important that AI stays accessible so that we know both how it works and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI's impact, we can start getting an idea of how bad they are and start addressing them as we go. Start creating guardrails to protect society and the planet. And once we have this information, companies can use it in order to say, OK, we're going to choose this model because it's more sustainable, this model because it respects copyright. Legislators, who really need information to write laws, can use these tools to develop new regulation mechanisms or governance for AI as it gets deployed into society. And users like you and me can use this information to choose AI models that we can trust, not to misrepresent us and not to misuse our data.

Frequently Occurring Word Combinations


n-grams of length 2

collocation            frequency
ai models              9
image generation       4
large language         3
training ai            3
climate change         2
creating tools         2
understand ai          2
language models        2
environmental costs    2
future existential     2
tangible impacts       2
tool called            2
data sets              2
generation models      2

n-grams of length 3

collocation               frequency
training ai models        2
image generation models   2
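
Tables like the two above are straightforward to recompute: tokenize the text, slide a window of length n over the token list, and count. A sketch with Python's collections.Counter, assuming simple lowercase word tokens:

    import re
    from collections import Counter

    def ngram_counts(text, n):
        """Count adjacent n-word sequences in a text."""
        # Naive tokenization: lowercase alphabetic runs only.
        words = re.findall(r"[a-z]+", text.lower())
        grams = zip(*(words[i:] for i in range(n)))
        return Counter(" ".join(g) for g in grams)

    text = "training ai models means training ai models on data"
    for gram, freq in ngram_counts(text, 2).most_common(3):
        print(gram, freq)

Note that the counts were evidently taken over the full talk transcript, not just this section, which is why bigrams like 'image generation' and 'large language' appear here without occurring in the excerpt above.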


Important Words


  1. accessible
  2. addressing
  3. ai
  4. bad
  5. bias
  6. cell
  7. change
  8. choose
  9. climate
  10. code
  11. companies
  12. complex
  13. copyright
  14. creating
  15. data
  16. deployed
  17. develop
  18. economies
  19. engage
  20. event
  21. fabric
  22. feeds
  23. gender
  24. governance
  25. guardrails
  26. idea
  27. impact
  28. important
  29. information
  30. interest
  31. justice
  32. laws
  33. legislation
  34. legislators
  35. life
  36. measure
  37. mechanisms
  38. media
  39. misrepresent
  40. misuse
  41. model
  42. models
  43. order
  44. people
  45. phones
  46. planet
  47. presented
  48. professions
  49. protect
  50. regulation
  51. respects
  52. sadly
  53. single
  54. social
  55. societies
  56. society
  57. solution
  58. start
  59. stays
  60. sustainable
  61. systems
  62. terms
  63. tool
  64. tools
  65. trust
  66. understand
  67. users
  68. walks
  69. work
  70. works
  71. woven
  72. write
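
The word list above looks like the output of a standard pipeline: lowercase the transcript, keep alphabetic tokens, drop common stopwords, deduplicate, and sort. A rough sketch, with a deliberately tiny stopword list standing in for whatever list the exercise actually uses:

    import re

    # A deliberately tiny stopword list for illustration; a real
    # generator would use a fuller one (e.g. NLTK's English list).
    STOPWORDS = {"and", "the", "to", "of", "a", "in", "it", "we", "our",
                 "that", "this", "for", "is", "are", "be", "can", "so",
                 "how", "or", "as", "but", "you", "me", "my", "not"}

    def important_words(text):
        """Lowercase, keep alphabetic tokens, drop stopwords, dedupe, sort."""
        words = set(re.findall(r"[a-z]+", text.lower()))
        return sorted(words - STOPWORDS)

    print(important_words("And once we have this information, companies "
                          "can use it to choose sustainable models."))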