Full Transcript
From the TED Talk by Sasha Luccioni: "AI is dangerous, but not for the reasons you think"
Unscramble the Blue Letters
And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tolos for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.
And as these mledos are being doeepyld, are being wveon into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our eiemocons have AI in them. And it's really important that AI stays accessible so that we know both how it wkros and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to mruease AI's impact, we can start getting an idea of how bad they are and start adnersisdg them as we go. Start creating guardrails to protect society and the planet. And once we have this information, cimenaops can use it in order to say, OK, we're going to choose this model because it's more slaaistunbe, this model because it rtesceps copyright. Legislators who really need information to write laws can use these tools to doevlep new regulation mechanisms or gevocnnare for AI as it gets deployed into society. And users like you and me can use this iiomaontrfn to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
Open Cloze
And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make _____ for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.
And as these ______ are being ________, are being _____ into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our _________ have AI in them. And it's really important that AI stays accessible so that we know both how it _____ and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to _______ AI's impact, we can start getting an idea of how bad they are and start __________ them as we go. Start creating guardrails to protect society and the planet. And once we have this information, _________ can use it in order to say, OK, we're going to choose this model because it's more ___________, this model because it ________ copyright. Legislators who really need information to write laws can use these tools to _______ new regulation mechanisms or __________ for AI as it gets deployed into society. And users like you and me can use this ___________ to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
Solution
- tools
- models
- deployed
- woven
- economies
- works
- measure
- addressing
- companies
- sustainable
- respects
- develop
- governance
- information
Original Text
And sadly, my tool hasn't been used to write legislation yet. But I recently presented it at a UN event about gender bias as an example of how we can make tools for people from all walks of life, even those who don't know how to code, to engage with and better understand AI because we use professions, but you can use any terms that are of interest to you.
And as these models are being deployed, are being woven into the very fabric of our societies, our cell phones, our social media feeds, even our justice systems and our economies have AI in them. And it's really important that AI stays accessible so that we know both how it works and when it doesn't work. And there's no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI's impact, we can start getting an idea of how bad they are and start addressing them as we go. Start creating guardrails to protect society and the planet. And once we have this information, companies can use it in order to say, OK, we're going to choose this model because it's more sustainable, this model because it respects copyright. Legislators who really need information to write laws can use these tools to develop new regulation mechanisms or governance for AI as it gets deployed into society. And users like you and me can use this information to choose AI models that we can trust, not to misrepresent us and not to misuse our data.
Frequently Occurring Word Combinations
ngrams of length 2
| collocation | frequency |
| --- | --- |
| ai models | 9 |
| image generation | 4 |
| large language | 3 |
| training ai | 3 |
| climate change | 2 |
| creating tools | 2 |
| understand ai | 2 |
| language models | 2 |
| environmental costs | 2 |
| future existential | 2 |
| tangible impacts | 2 |
| tool called | 2 |
| data sets | 2 |
| generation models | 2 |
ngrams of length 3
| collocation | frequency |
| --- | --- |
| training ai models | 2 |
| image generation models | 2 |
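The page doesn't say how these collocation counts are produced, but the tables can be reproduced with a short n-gram counter. Below is a minimal sketch in plain Python; the file name transcript.txt is a hypothetical placeholder for the full talk transcript, and the cutoff of two occurrences is inferred from the tables above.

```python
import re
from collections import Counter

def ngram_counts(text, n):
    """Count n-grams (runs of n consecutive words) in lowercased text."""
    words = re.findall(r"[a-z']+", text.lower())
    grams = zip(*(words[i:] for i in range(n)))  # sliding windows of length n
    return Counter(" ".join(g) for g in grams)

# transcript.txt is a hypothetical path holding the full transcript.
text = open("transcript.txt").read()
for n in (2, 3):
    print(f"ngrams of length {n}")
    for gram, freq in ngram_counts(text, n).most_common():
        if freq >= 2:  # the tables above list only collocations seen at least twice
            print(f"| {gram} | {freq} |")
```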
Important Words
- accessible
- addressing
- ai
- bad
- bias
- cell
- change
- choose
- climate
- code
- companies
- complex
- copyright
- creating
- data
- deployed
- develop
- economies
- engage
- event
- fabric
- feeds
- gender
- governance
- guardrails
- idea
- impact
- important
- information
- interest
- justice
- laws
- legislation
- legislators
- life
- measure
- mechanisms
- media
- misrepresent
- misuse
- model
- models
- order
- people
- phones
- planet
- presented
- professions
- protect
- regulation
- respects
- sadly
- single
- social
- societies
- society
- solution
- start
- stays
- sustainable
- systems
- terms
- tool
- tools
- trust
- understand
- users
- walks
- work
- works
- woven
- write
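How the "Important Words" list is derived is not documented on the page; a common approach is to keep the unique content words of the passage after removing stopwords, then alphabetize them. The sketch below assumes that method, with a hand-picked stopword set, so its output will only approximate the list above.

```python
import re

# An assumed stopword set for illustration; the page's actual filtering
# rules are not documented.
STOPWORDS = {
    "a", "an", "and", "are", "as", "at", "be", "because", "but", "by", "can",
    "doesn't", "even", "for", "from", "get", "getting", "go", "going", "have",
    "how", "in", "into", "it", "it's", "know", "like", "make", "me", "more",
    "my", "need", "new", "no", "not", "of", "ok", "once", "or", "our",
    "really", "say", "so", "that", "the", "them", "there's", "these", "this",
    "to", "use", "used", "we", "we're", "who", "yet", "you",
}

def important_words(text):
    """Return the alphabetized unique content words left after stopword removal."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(w for w in words if w not in STOPWORDS)

# transcript.txt is the same hypothetical transcript file used above.
for word in important_words(open("transcript.txt").read()):
    print("-", word)
```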