
Chapter 3: Coding Attention Mechanisms

  • ch03.ipynb contains all the code as it appears in the chapter
  • multihead-attention.ipynb is a minimal notebook implementing the main multi-head attention mechanism covered in this chapter
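
The core idea of the chapter's multi-head attention can be sketched as follows. This is a minimal illustration in PyTorch, not the exact class from the notebooks; the names (`MultiHeadAttention`, `d_in`, `d_out`, `num_heads`) and the choice of a causal mask are assumptions for this sketch:

```python
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    """Minimal causal multi-head self-attention sketch (names are illustrative)."""

    def __init__(self, d_in, d_out, num_heads, qkv_bias=False):
        super().__init__()
        assert d_out % num_heads == 0, "d_out must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = d_out // num_heads
        self.W_query = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.W_key = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.W_value = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.out_proj = nn.Linear(d_out, d_out)

    def forward(self, x):
        b, num_tokens, _ = x.shape
        # Project inputs and split into heads: (b, num_heads, num_tokens, head_dim)
        q = self.W_query(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.W_key(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.W_value(x).view(b, num_tokens, self.num_heads, self.head_dim).transpose(1, 2)

        # Scaled dot-product attention scores
        scores = q @ k.transpose(2, 3) / self.head_dim ** 0.5

        # Causal mask: each token may only attend to itself and earlier tokens
        mask = torch.triu(torch.ones(num_tokens, num_tokens, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))

        weights = torch.softmax(scores, dim=-1)

        # Combine heads back into (b, num_tokens, d_out) and project
        context = (weights @ v).transpose(1, 2).reshape(b, num_tokens, -1)
        return self.out_proj(context)


torch.manual_seed(123)
mha = MultiHeadAttention(d_in=8, d_out=8, num_heads=2)
out = mha(torch.randn(2, 5, 8))
print(out.shape)  # torch.Size([2, 5, 8])
```

The output has the same sequence length as the input, with each position's vector now a (projected) mixture of the value vectors of that position and all earlier positions.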