Chapter 2: Working with Text Data

Main Chapter Code

  • 01_main-chapter-code contains the main chapter code


Bonus Materials

  • 02_bonus_bytepair-encoder contains optional (bonus) code to benchmark different byte pair encoder implementations.

  • 03_bonus_embedding-vs-matmul contains optional (bonus) code to explain that embedding layers and fully connected layers applied to one-hot encoded vectors are equivalent.

  • 04_bonus_dataloader-intuition contains optional (bonus) code to explain the data loader more intuitively with simple numbers rather than text.

  • 05_bpe-from-scratch contains optional (bonus) code that implements and trains a GPT-2 BPE tokenizer from scratch.
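The embedding-vs-matmul equivalence mentioned above can be sketched in a few lines. This is a minimal illustration using NumPy (the bonus notebook itself works with PyTorch layers): looking up a row of the embedding weight matrix gives the same result as multiplying a one-hot vector by that matrix.

```python
import numpy as np

# Hypothetical small sizes for illustration
vocab_size, embed_dim = 5, 3
rng = np.random.default_rng(0)
W = rng.standard_normal((vocab_size, embed_dim))  # embedding weight matrix

token_id = 2

# "Embedding layer": direct row lookup
lookup = W[token_id]

# Fully connected layer applied to a one-hot encoded vector
one_hot = np.zeros(vocab_size)
one_hot[token_id] = 1.0
matmul = one_hot @ W

print(np.allclose(lookup, matmul))  # the two results are identical
```

The lookup is just a cheaper way to compute the same thing, which is why embedding layers are used in practice instead of one-hot matrix multiplications.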

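The core idea behind BPE training in the 05_bpe-from-scratch notebook can be sketched as follows. This is a deliberately simplified, hypothetical version of one training step (count adjacent token pairs, then merge the most frequent pair into a new token); the notebook implements the full GPT-2-style tokenizer.

```python
from collections import Counter

def get_pair_counts(tokens):
    """Count occurrences of adjacent token pairs."""
    return Counter(zip(tokens, tokens[1:]))

def merge_pair(tokens, pair, new_token):
    """Replace every occurrence of `pair` with `new_token`."""
    merged, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            merged.append(new_token)
            i += 2  # skip both tokens of the merged pair
        else:
            merged.append(tokens[i])
            i += 1
    return merged

tokens = list("aaabdaaabac")
pair = get_pair_counts(tokens).most_common(1)[0][0]  # most frequent: ('a', 'a')
tokens = merge_pair(tokens, pair, "aa")
print(tokens)  # ['aa', 'a', 'b', 'd', 'aa', 'a', 'b', 'a', 'c']
```

Repeating this step for a fixed number of merges yields the tokenizer's vocabulary; encoding then applies the learned merges in order.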
As supplementary material, the video below provides a code-along session that covers some of the chapter contents.



Link to the video