Word Embedding in PyTorch + Lightning
StatQuest with Josh Starmer

Published Nov 6, 2023

Word embedding is the first step in lots of neural networks, including Transformers (like ChatGPT) and other state-of-the-art models. Here we learn how to code a standalone word embedding network from scratch and with nn.Linear. We then learn how to load and use pre-trained word embedding values with nn.Embedding.
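The core idea can be sketched in a few lines. This is not the video's exact code, just a minimal illustration (the tiny vocabulary and sizes are made up): a one-hot vector passed through nn.Linear picks out one column of the weight matrix, so those weights are the embedding values, and nn.Embedding performs the same lookup directly from a token index.

```python
import torch
import torch.nn as nn

# Hypothetical 4-word vocabulary; words and sizes are illustrative only.
vocab = {"Troll2": 0, "is": 1, "great": 2, "Gymkata": 3}
embed_dim = 2

torch.manual_seed(0)

# Word embedding with nn.Linear: a one-hot input through a bias-free
# linear layer returns the weight-matrix column for that word.
linear = nn.Linear(in_features=len(vocab), out_features=embed_dim, bias=False)
one_hot = torch.zeros(len(vocab))
one_hot[vocab["is"]] = 1.0
embedding_from_linear = linear(one_hot)

# nn.Embedding does the same lookup from a token index, skipping the
# one-hot multiplication; here it is loaded with the same weights.
embedding = nn.Embedding.from_pretrained(linear.weight.T)
embedding_from_lookup = embedding(torch.tensor(vocab["is"]))

print(torch.allclose(embedding_from_linear, embedding_from_lookup))  # True
```

`nn.Embedding.from_pretrained()` is also how externally trained embedding values can be loaded, as covered at the end of the video.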

NOTE: This StatQuest assumes that you are already familiar with Word Embedding. If not, check out the 'Quest:    • Word Embedding and Word2Vec, Clearly ...

If you'd like to support StatQuest, please consider...
Patreon:   / statquest  
...or...
YouTube Membership:    / @statquest  

...buying my book, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/

...or just donating to StatQuest!
paypal: https://www.paypal.me/statquest
venmo: @JoshStarmer

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
  / joshuastarmer  

0:00 Awesome song and introduction
1:53 Importing modules
2:48 Encoding the training data
6:55 Word Embedding from scratch
16:54 Graphing the embedding values
20:37 Word Embedding with nn.Linear
21:17 Printing out predicted words
28:12 Loading and using pre-trained Embedding values with nn.Embedding
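
The "from scratch" portion of the video boils down to treating the embedding values as ordinary trainable weights. This is a hedged sketch, not the video's code (the toy word indices and hyperparameters are assumptions): one row of weights per vocabulary word, looked up by index and optimized with backpropagation like any other parameter.

```python
import torch
import torch.nn as nn

# From-scratch sketch: embedding values are just a trainable weight
# matrix with one row per vocabulary word.
torch.manual_seed(42)
vocab_size, embed_dim = 4, 2

weights = nn.Parameter(torch.randn(vocab_size, embed_dim))
output_layer = nn.Linear(embed_dim, vocab_size)
optimizer = torch.optim.SGD([weights] + list(output_layer.parameters()), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Toy training pair: given word 0, predict word 1 (indices are illustrative).
input_id = torch.tensor([0])
target_id = torch.tensor([1])

for _ in range(200):
    optimizer.zero_grad()
    hidden = weights[input_id]      # look up the embedding row by index
    logits = output_layer(hidden)   # project back to vocabulary size
    loss = loss_fn(logits, target_id)
    loss.backward()                 # gradients update the embedding too
    optimizer.step()

print(logits.argmax(dim=1).item())  # the trained network now predicts word 1
```

In the video this loop is wrapped in a PyTorch Lightning module, which handles the optimizer and training loop boilerplate; the underlying math is the same.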

#StatQuest #neuralnetworks #transformers
