Chapter 5: Pretraining on Unlabeled Data

  • 01_main-chapter-code contains the main chapter code
  • 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
  • 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM longer on the whole corpus of books from Project Gutenberg
  • 04_learning_rate_schedulers contains code implementing a more sophisticated training function, including learning rate schedulers and gradient clipping
  • 05_bonus_hparam_tuning contains an optional hyperparameter tuning script
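
As a brief illustration of what the learning rate scheduler bonus material covers, a common schedule combines linear warmup with cosine decay. The sketch below shows only the schedule math; the function name and parameters are illustrative and not taken from the chapter code:

```python
import math

def lr_at_step(step, total_steps, warmup_steps, peak_lr, min_lr=0.0):
    """Linear warmup to peak_lr, then cosine decay down to min_lr."""
    if step < warmup_steps:
        # Linear warmup: scale the peak learning rate by warmup progress
        return peak_lr * (step + 1) / warmup_steps
    # Cosine decay from peak_lr to min_lr over the remaining steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

In a PyTorch training loop, one would typically assign this value to each optimizer parameter group's `"lr"` entry before the update step, and apply gradient clipping with `torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)` after the backward pass.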