Full Transcript

From the TED Talk by Kostas Karpouzis: Can machines read your emotions?


Unscramble the Blue Letters


With every year, machines susarps hmunas in more and more activities we once thought only we were capable of. Today's computers can beat us in complex board gmeas, transcribe speech in dozens of languages, and instantly identify almost any object. But the robots of tooormrw may go further by learning to figure out what we're felnieg. And why does that matter? Because if machines and the people who run them can accurately read our emotional states, they may be able to assist us or manipulate us at unprecedented scales. But before we get there, how can something so complex as eitmoon be converted into mere numbers, the only language mehcnias understand? Essentially the same way our own brains interpret emotions, by learning how to spot them. American pghlsocsoyit Paul Ekman identified certain universal emotions whose vsaiul cues are understood the same way across cultures. For example, an image of a smile signals joy to modern uabrn dwellers and agroiianbl tribesmen aklie. And according to Ekman, agner, disgust, fear, joy, sadness, and surprise are equally recognizable. As it trnus out, copmrutes are rapidly getting better at image rtnoiiogecn thanks to machine learning algorithms, such as neural networks. These consist of artificial ndeos that mimic our biological neurons by fiomrng connections and exchanging information. To train the network, sample inputs pre-classified into different categories, such as photos marked happy or sad, are fed into the system. The network then learns to classify those samples by adjusting the rltiavee weights assigned to particular features. The more training data it's given, the better the algorithm becomes at crotclery identifying new images. This is similar to our own brains, which learn from previous experiences to shape how new smlitui are processed. Recognition algorithms aren't just limtied to facial expressions. Our emotions manifest in many ways. There's body language and vocal tone, changes in heart rate, complexion, and skin temperature, or even word frequency and sentence structure in our writing. You might think that training neural networks to recognize these would be a long and complicated task until you realize just how much data is out there, and how quickly mredon computers can process it. From sacoil media posts, uploaded photos and vodeis, and phone recordings, to heat-sensitive scuetiry cameras and wearables that monitor physiological signs, the big question is not how to collect enough data, but what we're going to do with it. There are ptlney of beneficial uses for computerized emotion recognition. Robots using algorithms to identify facial expressions can help children learn or podrvie lonely people with a ssene of companionship. Social media companies are considering using algorithms to help prevent suicides by flagging posts that contain specific words or phrases. And emotion recognition stwraofe can help treat mental disorders or even provide poplee with low-cost automated psychotherapy. Despite the potential benefits, the prospect of a massive network automatically scanning our photos, cnunmitcooiams, and physiological snigs is also quite drsbtiuing. What are the implications for our privacy when such iprmoeasnl systems are used by corporations to exploit our emotions through advertising? And what becomes of our rights if authorities think they can itniedfy the people likely to commit crimes before they even make a ciconusos decision to act?
Robots currently have a long way to go in distinguishing emotional nuances, like irony, and scales of emotions, just how happy or sad someone is. Nonetheless, they may eventually be able to accurately read our emotions and respond to them. Whether they can empathize with our fear of unwanted istronuin, however, that's another sorty.

Open Cloze


With every year, machines _______ ______ in more and more activities we once thought only we were capable of. Today's computers can beat us in complex board _____, transcribe speech in dozens of languages, and instantly identify almost any object. But the robots of ________ may go further by learning to figure out what we're _______. And why does that matter? Because if machines and the people who run them can accurately read our emotional states, they may be able to assist us or manipulate us at unprecedented scales. But before we get there, how can something so complex as _______ be converted into mere numbers, the only language ________ understand? Essentially the same way our own brains interpret emotions, by learning how to spot them. American ____________ Paul Ekman identified certain universal emotions whose ______ cues are understood the same way across cultures. For example, an image of a smile signals joy to modern _____ dwellers and __________ tribesmen _____. And according to Ekman, _____, disgust, fear, joy, sadness, and surprise are equally recognizable. As it _____ out, _________ are rapidly getting better at image ___________ thanks to machine learning algorithms, such as neural networks. These consist of artificial _____ that mimic our biological neurons by _______ connections and exchanging information. To train the network, sample inputs pre-classified into different categories, such as photos marked happy or sad, are fed into the system. The network then learns to classify those samples by adjusting the ________ weights assigned to particular features. The more training data it's given, the better the algorithm becomes at _________ identifying new images. This is similar to our own brains, which learn from previous experiences to shape how new _______ are processed. Recognition algorithms aren't just _______ to facial expressions. Our emotions manifest in many ways. There's body language and vocal tone, changes in heart rate, complexion, and skin temperature, or even word frequency and sentence structure in our writing. You might think that training neural networks to recognize these would be a long and complicated task until you realize just how much data is out there, and how quickly ______ computers can process it. From ______ media posts, uploaded photos and ______, and phone recordings, to heat-sensitive ________ cameras and wearables that monitor physiological signs, the big question is not how to collect enough data, but what we're going to do with it. There are ______ of beneficial uses for computerized emotion recognition. Robots using algorithms to identify facial expressions can help children learn or _______ lonely people with a _____ of companionship. Social media companies are considering using algorithms to help prevent suicides by flagging posts that contain specific words or phrases. And emotion recognition ________ can help treat mental disorders or even provide ______ with low-cost automated psychotherapy. Despite the potential benefits, the prospect of a massive network automatically scanning our photos, ______________, and physiological _____ is also quite __________. What are the implications for our privacy when such __________ systems are used by corporations to exploit our emotions through advertising? And what becomes of our rights if authorities think they can ________ the people likely to commit crimes before they even make a _________ decision to act?
Robots currently have a long way to go in distinguishing emotional nuances, like irony, and scales of emotions, just how happy or sad someone is. Nonetheless, they may eventually be able to accurately read our emotions and respond to them. Whether they can empathize with our fear of unwanted _________, however, that's another _____.

Solution


  1. machines
  2. disturbing
  3. psychologist
  4. people
  5. conscious
  6. forming
  7. security
  8. modern
  9. aboriginal
  10. stimuli
  11. intrusion
  12. identify
  13. visual
  14. humans
  15. sense
  16. social
  17. signs
  18. plenty
  19. impersonal
  20. alike
  21. communications
  22. videos
  23. emotion
  24. limited
  25. provide
  26. nodes
  27. urban
  28. software
  29. anger
  30. correctly
  31. tomorrow
  32. feeling
  33. story
  34. surpass
  35. recognition
  36. games
  37. turns
  38. computers
  39. relative

Original Text


With every year, machines surpass humans in more and more activities we once thought only we were capable of. Today's computers can beat us in complex board games, transcribe speech in dozens of languages, and instantly identify almost any object. But the robots of tomorrow may go further by learning to figure out what we're feeling. And why does that matter? Because if machines and the people who run them can accurately read our emotional states, they may be able to assist us or manipulate us at unprecedented scales. But before we get there, how can something so complex as emotion be converted into mere numbers, the only language machines understand? Essentially the same way our own brains interpret emotions, by learning how to spot them. American psychologist Paul Ekman identified certain universal emotions whose visual cues are understood the same way across cultures. For example, an image of a smile signals joy to modern urban dwellers and aboriginal tribesmen alike. And according to Ekman, anger, disgust, fear, joy, sadness, and surprise are equally recognizable. As it turns out, computers are rapidly getting better at image recognition thanks to machine learning algorithms, such as neural networks. These consist of artificial nodes that mimic our biological neurons by forming connections and exchanging information. To train the network, sample inputs pre-classified into different categories, such as photos marked happy or sad, are fed into the system. The network then learns to classify those samples by adjusting the relative weights assigned to particular features. The more training data it's given, the better the algorithm becomes at correctly identifying new images. This is similar to our own brains, which learn from previous experiences to shape how new stimuli are processed. Recognition algorithms aren't just limited to facial expressions. Our emotions manifest in many ways. There's body language and vocal tone, changes in heart rate, complexion, and skin temperature, or even word frequency and sentence structure in our writing. You might think that training neural networks to recognize these would be a long and complicated task until you realize just how much data is out there, and how quickly modern computers can process it. From social media posts, uploaded photos and videos, and phone recordings, to heat-sensitive security cameras and wearables that monitor physiological signs, the big question is not how to collect enough data, but what we're going to do with it. There are plenty of beneficial uses for computerized emotion recognition. Robots using algorithms to identify facial expressions can help children learn or provide lonely people with a sense of companionship. Social media companies are considering using algorithms to help prevent suicides by flagging posts that contain specific words or phrases. And emotion recognition software can help treat mental disorders or even provide people with low-cost automated psychotherapy. Despite the potential benefits, the prospect of a massive network automatically scanning our photos, communications, and physiological signs is also quite disturbing. What are the implications for our privacy when such impersonal systems are used by corporations to exploit our emotions through advertising? And what becomes of our rights if authorities think they can identify the people likely to commit crimes before they even make a conscious decision to act?
Robots currently have a long way to go in distinguishing emotional nuances, like irony, and scales of emotions, just how happy or sad someone is. Nonetheless, they may eventually be able to accurately read our emotions and respond to them. Whether they can empathize with our fear of unwanted intrusion, however, that's another story.
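The transcript describes, in plain words, how a neural network is trained: pre-classified samples are fed in, and the relative weights connecting artificial nodes are adjusted until new inputs are classified correctly. The talk gives no code; the following is only a minimal Python/NumPy sketch of that loop, with hand-made toy vectors standing in for labelled happy/sad photos. Every feature, number, and name here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-number inputs standing in for "photos marked happy or sad"
# (a real system would use pixel values or facial-landmark measurements).
X = np.array([[0.9, 0.1, 0.8, 0.2],   # happy-looking sample
              [0.8, 0.2, 0.7, 0.1],   # happy-looking sample
              [0.1, 0.9, 0.2, 0.8],   # sad-looking sample
              [0.2, 0.8, 0.1, 0.9]])  # sad-looking sample
y = np.array([[1.0], [1.0], [0.0], [0.0]])  # pre-classified labels: 1 = happy

# One hidden layer of "artificial nodes"; the weights are their connections.
W1 = rng.normal(scale=0.5, size=(4, 5))
W2 = rng.normal(scale=0.5, size=(5, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate: how hard each mistake nudges the weights
for step in range(2000):
    # Forward pass: nodes exchange information through their connections.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backward pass: adjust the relative weights to shrink the gap between
    # the network's guesses and the pre-classified labels.
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta_out
    W1 -= lr * X.T @ delta_h

# A new, unseen sample; with more and better training data, this guess improves.
new = np.array([[0.85, 0.15, 0.9, 0.1]])
print("P(happy) =", sigmoid(sigmoid(new @ W1) @ W2).item())
```

The toy network learns a trivial pattern, but it mirrors the transcript's point: the more (and more varied) the training data, the better the adjusted weights generalize to samples the network has never seen.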
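The transcript also notes that emotions surface in "word frequency and sentence structure in our writing." As a hedged illustration of what converting writing into "mere numbers" can look like, here is a small Python sketch; the three features and the crude tokenization are assumptions made for the example, not anything the talk specifies.

```python
from collections import Counter

def text_features(text: str) -> dict:
    """Reduce a snippet of writing to a few illustrative numbers."""
    # Treat !, ?, and . as sentence boundaries (deliberately crude).
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = text.lower().split()
    counts = Counter(words)
    return {
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(counts) / max(len(words), 1),  # vocabulary variety
        "exclamations": text.count("!"),
    }

print(text_features("I can't believe it! This is the best day. The best!"))
```

Features like these could then be fed to a classifier such as the one sketched above, which is the pipeline the talk gestures at: raw signals become numbers, and numbers become predictions.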

Frequently Occurring Word Combinations


ngrams of length 2

collocation           frequency
accurately read       2
neural networks       2
facial expressions    2
social media          2
emotion recognition   2



Important Words


  1. aboriginal
  2. accurately
  3. act
  4. activities
  5. adjusting
  6. advertising
  7. algorithm
  8. algorithms
  9. alike
  10. american
  11. anger
  12. artificial
  13. assigned
  14. assist
  15. authorities
  16. automated
  17. automatically
  18. beat
  19. beneficial
  20. benefits
  21. big
  22. biological
  23. board
  24. body
  25. brains
  26. cameras
  27. capable
  28. categories
  29. children
  30. classify
  31. collect
  32. commit
  33. communications
  34. companies
  35. companionship
  36. complex
  37. complexion
  38. complicated
  39. computerized
  40. computers
  41. connections
  42. conscious
  43. consist
  44. converted
  45. corporations
  46. correctly
  47. crimes
  48. cues
  49. cultures
  50. data
  51. decision
  52. disgust
  53. disorders
  54. distinguishing
  55. disturbing
  56. dozens
  57. dwellers
  58. ekman
  59. emotion
  60. emotional
  61. emotions
  62. empathize
  63. equally
  64. essentially
  65. eventually
  66. exchanging
  67. experiences
  68. exploit
  69. expressions
  70. facial
  71. fear
  72. features
  73. fed
  74. feeling
  75. figure
  76. flagging
  77. forming
  78. frequency
  79. further
  80. games
  81. happy
  82. heart
  83. humans
  84. identified
  85. identify
  86. identifying
  87. image
  88. images
  89. impersonal
  90. implications
  91. information
  92. inputs
  93. instantly
  94. interpret
  95. intrusion
  96. irony
  97. joy
  98. language
  99. languages
  100. learn
  101. learning
  102. learns
  103. limited
  104. lonely
  105. long
  106. machine
  107. machines
  108. manifest
  109. manipulate
  110. marked
  111. massive
  112. matter
  113. media
  114. mental
  115. mere
  116. mimic
  117. modern
  118. monitor
  119. network
  120. networks
  121. neural
  122. neurons
  123. nodes
  124. nuances
  125. numbers
  126. object
  127. paul
  128. people
  129. phone
  130. photos
  131. phrases
  132. physiological
  133. plenty
  134. posts
  135. potential
  136. prevent
  137. previous
  138. privacy
  139. process
  140. processed
  141. prospect
  142. provide
  143. psychologist
  144. psychotherapy
  145. question
  146. quickly
  147. rapidly
  148. rate
  149. read
  150. realize
  151. recognition
  152. recognizable
  153. recognize
  154. recordings
  155. relative
  156. respond
  157. rights
  158. robots
  159. run
  160. sad
  161. sadness
  162. sample
  163. samples
  164. scales
  165. scanning
  166. security
  167. sense
  168. sentence
  169. shape
  170. signals
  171. signs
  172. similar
  173. skin
  174. smile
  175. social
  176. software
  177. specific
  178. speech
  179. spot
  180. states
  181. stimuli
  182. story
  183. structure
  184. suicides
  185. surpass
  186. surprise
  187. system
  188. systems
  189. task
  190. temperature
  191. thought
  192. tomorrow
  193. tone
  194. train
  195. training
  196. transcribe
  197. treat
  198. tribesmen
  199. turns
  200. understand
  201. understood
  202. universal
  203. unprecedented
  204. unwanted
  205. uploaded
  206. urban
  207. videos
  208. visual
  209. vocal
  210. ways
  211. wearables
  212. weights
  213. word
  214. words
  215. writing
  216. year