The spelled-out intro to language modeling: building makemore
We implement a bigram character-level language model, which we will further complexify in follow-up videos into a modern Transformer language model, like GPT. In this video, the focus is on (1) introducing torch.Tensor and its subtleties and use in efficiently evaluating neural networks, and (2) the overall framework of language modeling that includes model training, sampling, and the evaluation of a loss (e.g. the negative log likelihood for classification).
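To make the loss half of that framework concrete, here is a minimal sketch of the counting-based bigram model and its average negative log likelihood. The file name names.txt and the variable names follow the video's setup, but treat the details as illustrative rather than the exact notebook code:

```python
import torch

# Assumed setup: names.txt holds one lowercase name per line.
words = open('names.txt', 'r').read().splitlines()
chars = ['.'] + sorted(set(''.join(words)))  # '.' marks name start/end
stoi = {s: i for i, s in enumerate(chars)}

# "Training": count how often each character follows each other one.
N = torch.zeros((len(chars), len(chars)), dtype=torch.int32)
for w in words:
    for ch1, ch2 in zip('.' + w, w + '.'):
        N[stoi[ch1], stoi[ch2]] += 1

# Row-normalize the counts into next-character probabilities.
# (+1 is the "fake count" smoothing from 01:00:50; it avoids log(0).)
P = (N + 1).float()
P /= P.sum(dim=1, keepdim=True)

# Evaluate: average negative log likelihood of the data under the model.
log_likelihood = torch.tensor(0.0)
n = 0
for w in words:
    for ch1, ch2 in zip('.' + w, w + '.'):
        log_likelihood += torch.log(P[stoi[ch1], stoi[ch2]])
        n += 1
print(f'nll = {(-log_likelihood / n).item():.4f}')
```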
Links:
- makemore on github:
- jupyter notebook I built in this video:
- my website:
- my twitter:
Useful links for practice:
- Python Numpy tutorial from CS231n. We use torch.tensor instead of numpy.array in this video. Their design (e.g. broadcasting, data types, etc.) is so similar that practicing one is basically practicing the other; just be careful with some of the APIs - how various functions are named, what arguments they take, etc. - these details can vary. (A quick broadcasting sketch follows this list.)
- PyTorch tutorial on Tensor
- Another PyTorch intro to Tensor
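Since broadcasting is the main thing to be careful with in both libraries, here is a quick sketch of the row-normalization pattern the video dissects at 00:36:17 (N below is random toy data, not the video's count matrix):

```python
import torch

N = torch.randint(1, 10, (27, 27)).float()  # toy stand-in for bigram counts

# keepdim=True keeps the row sums as shape (27, 1), which broadcasts
# across the columns and divides each row by its own sum:
P = N / N.sum(dim=1, keepdim=True)
assert torch.allclose(P.sum(dim=1), torch.ones(27))

# Pitfall: with keepdim=False the sums have shape (27,), which
# broadcasts as (1, 27), so element [i, j] is divided by row j's
# sum - a silent bug rather than an error.
```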
Exercises:
E01: train a trigram language model, i.e. take two characters as an input to predict the 3rd one. Feel free to use either counting or a neural net. Evaluate the loss; did it improve over a bigram model?
E02: split up the dataset randomly into 80% train set, 10% dev set, 10% test set. Train the bigram and trigram models only on the training set. Evaluate them on dev and test splits. What can you see?
E03: use the dev set to tune the strength of smoothing (or regularization) for the trigram model - i.e. try many possibilities and see which one works best based on the dev set loss. What patterns can you see in the train and dev set loss as you tune this strength? Take the best setting of the smoothing and evaluate on the test set once, at the end. How good of a loss do you achieve?
E04: we saw that our 1-hot vectors merely select a row of W, so producing these vectors explicitly feels wasteful. Can you delete our use of F.one_hot in favor of simply indexing into rows of W? (A sketch of E04 and E05 follows this list.)
E05: look up and use F.cross_entropy instead. You should achieve the same result. Can you think of why we’d prefer to use F.cross_entropy instead? (See the sketch below.)
E06: meta-exercise! Think of a fun/interesting exercise and complete it.
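For E04 and E05, the equivalence behind note 1 (01:47:49) can be sketched as follows; xs, ys, and W are assumed names mirroring the bigram-net setup in the video, and the indices are toy data:

```python
import torch
import torch.nn.functional as F

# Toy integer-encoded (input, target) character pairs.
xs = torch.tensor([0, 5, 13])
ys = torch.tensor([5, 13, 0])
W = torch.randn((27, 27), requires_grad=True)

# One-hot route: the matmul merely selects rows of W ...
logits_onehot = F.one_hot(xs, num_classes=27).float() @ W
# ... so plain row indexing produces the same logits (E04):
logits = W[xs]
assert torch.allclose(logits, logits_onehot)

# Manual softmax + negative log likelihood, as built up in the video:
probs = logits.exp() / logits.exp().sum(dim=1, keepdim=True)
loss_manual = -probs[torch.arange(len(ys)), ys].log().mean()

# F.cross_entropy fuses the same computation (E05); it is preferred
# because it is faster and numerically stable (it avoids overflow
# when exponentiating large logits).
loss = F.cross_entropy(logits, ys)
assert torch.allclose(loss, loss_manual)
```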
Chapters:
00:00:00 intro
00:03:03 reading and exploring the dataset
00:06:24 exploring the bigrams in the dataset
00:09:24 counting bigrams in a python dictionary
00:12:45 counting bigrams in a 2D torch tensor (“training the model”)
00:18:19 visualizing the bigram tensor
00:20:54 deleting spurious (S) and (E) tokens in favor of a single . token
00:24:02 sampling from the model
00:36:17 efficiency! vectorized normalization of the rows, tensor broadcasting
00:50:14 loss function (the negative log likelihood of the data under our model)
01:00:50 model smoothing with fake counts
01:02:57 PART 2: the neural network approach: intro
01:05:26 creating the bigram dataset for the neural net
01:10:01 feeding integers into neural nets? one-hot encodings
01:13:53 the “neural net”: one linear layer of neurons implemented with matrix multiplication
01:18:46 transforming neural net outputs into probabilities: the softmax
01:26:17 summary, preview to next steps, reference to micrograd
01:35:49 vectorized loss
01:38:36 backward and update, in PyTorch
01:42:55 putting everything together
01:47:49 note 1: one-hot encoding really just selects a row of the next Linear layer’s weight matrix
01:50:18 note 2: model smoothing as regularization loss
01:54:31 sampling from the neural net
01:56:16 conclusion