Full Transcript

From the TED Talk by Eric Topol: "Can AI catch what doctors miss?"


Unscramble the Blue Letters


So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational psrgreos in many fields.

The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our huamn brain has 100 trillion cineonotncs or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now mamtdliuol with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised lainnreg, which is a big bonetelctk in medicine because we can't get experts to lebal iagems. This can be done with self-supervised learning.

Open Cloze


So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational ________ in many fields.

The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our _____ brain has 100 trillion ___________ or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now __________ with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised ________, which is a big __________ in medicine because we can't get experts to _____ ______. This can be done with self-supervised learning.

Solution


  1. images
  2. progress
  3. human
  4. multimodal
  5. bottleneck
  6. label
  7. learning
  8. connections

Original Text


So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational progress in many fields.
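The paper the speaker cites, "Attention is All You Need", introduced scaled dot-product attention, which is what lets a transformer compare every item in a sequence against every other item and weigh them in context. Below is a minimal sketch of that mechanism, assuming NumPy is available; the function and variable names are illustrative only and are not taken from the talk.

    # Minimal sketch of scaled dot-product attention (illustrative, not from the talk).
    import numpy as np

    def scaled_dot_product_attention(queries, keys, values):
        """queries, keys, values: arrays of shape (sequence_length, dimension)."""
        d = queries.shape[-1]
        # Similarity of every position with every other position.
        scores = queries @ keys.T / np.sqrt(d)
        # Softmax turns scores into attention weights that sum to 1 per row.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output row is a weighted mixture of all value rows,
        # which is how every item gets "put in context" of the others.
        return weights @ values

    # Toy usage: a sequence of 4 items, each an 8-dimensional vector.
    x = np.random.randn(4, 8)
    print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)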

The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our human brain has 100 trillion connections or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now multimodal with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised learning, which is a big bottleneck in medicine because we can't get experts to label images. This can be done with self-supervised learning.
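The labeling bottleneck the speaker mentions is eased because self-supervised learning manufactures its own training targets from unlabeled data, for example by hiding part of an input and asking the model to predict the hidden part. The sketch below only illustrates that idea; it is not the speaker's method, and the names are hypothetical.

    # Illustrative sketch of a self-supervised (masked-prediction) training pair:
    # the "label" comes from the data itself, so no expert annotation is needed.
    import random

    def make_masked_example(tokens, mask_token="[MASK]"):
        """Turn an unlabeled token sequence into an (input, target) training pair."""
        position = random.randrange(len(tokens))
        target = tokens[position]          # the hidden token serves as the label
        masked = list(tokens)
        masked[position] = mask_token      # the model sees this as its input
        return masked, target

    # Toy usage on an unlabeled sentence: the training pair is built automatically.
    inputs, label = make_masked_example("deep learning needs no hand labels".split())
    print(inputs, "->", label)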

Frequently Occurring Word Combinations


n-grams of length 2

collocation            frequency
machine vision         3
keyboard liberation    3
supervised learning    2
blood pressure         2
kidney disease         2



Important Words


  1. ability
  2. amount
  3. big
  4. bottleneck
  5. brain
  6. cited
  7. classic
  8. connections
  9. context
  10. deep
  11. experts
  12. fields
  13. graphic
  14. human
  15. images
  16. information
  17. interestingly
  18. involves
  19. items
  20. knowledge
  21. label
  22. language
  23. learning
  24. massive
  25. medicine
  26. models
  27. multimodal
  28. networks
  29. neural
  30. outgrowth
  31. packed
  32. parameters
  33. processing
  34. progress
  35. prototype
  36. put
  37. setting
  38. speech
  39. transformational
  40. transformer
  41. trillion
  42. units