# Chapter 5: Pretraining on Unlabeled Data

## Main Chapter Code

- 01_main-chapter-code contains the main chapter code

## Bonus Materials
- 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
- 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM for longer on the whole corpus of books from Project Gutenberg
- 04_learning_rate_schedulers contains code implementing a more sophisticated training function that includes learning rate schedulers and gradient clipping
- 05_bonus_hparam_tuning contains an optional hyperparameter tuning script
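The learning-rate-scheduler bonus material combines a warmup phase, cosine decay, and gradient clipping. As a rough orientation before diving into that folder, here is a minimal, framework-free sketch of those two ideas; the function names, hyperparameter values, and pure-Python style are illustrative assumptions, not the code used in 04_learning_rate_schedulers.

```python
import math

def lr_at_step(step, total_steps, peak_lr=5e-4, min_lr=1e-6, warmup_steps=20):
    # Linear warmup from ~0 to peak_lr, then cosine decay down to min_lr.
    # (Illustrative defaults; not the chapter's actual hyperparameters.)
    if step < warmup_steps:
        return peak_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

def clip_by_global_norm(grads, max_norm=1.0):
    # Rescale a flat list of gradient values so their global L2 norm
    # does not exceed max_norm (the idea behind gradient clipping).
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm and norm > 0.0:
        grads = [g * max_norm / norm for g in grads]
    return grads
```

In a PyTorch training loop these roles are typically played by a `torch.optim.lr_scheduler` instance and `torch.nn.utils.clip_grad_norm_`, called once per optimizer step.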