Full Transcript
From the TED Talk by Eric Topol: Can AI catch what doctors miss?
Unscramble the Blue Letters
So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational psrgreos in many fields.
The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our huamn brain has 100 trillion cineonotncs or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now mamtdliuol with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised lainnreg, which is a big bonetelctk in medicine because we can't get experts to lebal iagems. This can be done with self-supervised learning.
Open Cloze
So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational ________ in many fields.
The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our _____ brain has 100 trillion ___________ or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now __________ with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised ________, which is a big __________ in medicine because we can't get experts to _____ ______. This can be done with self-supervised learning.
Solution
- images
- progress
- human
- multimodal
- bottleneck
- label
- learning
- connections
Original Text
So when we go from the deep neural networks to transformer models, this classic pre-print, one of the most cited pre-prints ever, "Attention is All You Need," the ability to now be able to look at many more items, whether it be language or images, and be able to put this in context, setting up a transformational progress in many fields.
The prototype is, the outgrowth of this is GPT-4. With over a trillion connections. Our human brain has 100 trillion connections or parameters. But one trillion, just think of all the information, knowledge, that's packed into those one trillion. And interestingly, this is now multimodal with language, with images, with speech. And it involves a massive amount of graphic processing units. And it's with self-supervised learning, which is a big bottleneck in medicine because we can't get experts to label images. This can be done with self-supervised learning.
Frequently Occurring Word Combinations
n-grams of length 2
collocation | frequency
machine vision | 3
keyboard liberation | 3
supervised learning | 2
blood pressure | 2
kidney disease | 2
trillion connections | 2
Important Words
- ability
- amount
- big
- bottleneck
- brain
- cited
- classic
- connections
- context
- deep
- experts
- fields
- graphic
- human
- images
- information
- interestingly
- involves
- items
- knowledge
- label
- language
- learning
- massive
- medicine
- models
- multimodal
- networks
- neural
- outgrowth
- packed
- parameters
- processing
- progress
- prototype
- put
- setting
- speech
- transformational
- transformer
- trillion
- units