v0.8.3
What's Changed
- Add test to show that `global_max_sequence_length` can never exceed an LLM's context length by @arnavgarg1 in #3548
- WandB: Add metric logging support on eval end and epoch end by @arnavgarg1 in #3586
- schema: Add `prompt` validation check by @ksbrar in #3564 (a sketch of a config with a `prompt` section appears after this list)
- Unpin Transformers for CodeLlama support by @arnavgarg1 in #3592
- Add support for Paged Optimizers (Adam, AdamW), 8-bit optimizers, and new optimizers: LARS, LAMB, and LION by @arnavgarg1 in #3588 (see the optimizer config sketch after this list)
- fix: Failure in TabTransformer Combiner Unit test by @jimthompson5802 in #3596
- fix: Move target tensor to model output device in `check_module_parameters_updated` by @jeffkinnison in #3567
- Allow users to specify a Hugging Face link or a local path to pretrained LoRA weights by @Infernaught in #3572 (see the adapter config sketch after this list)
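
For context on the `prompt` validation entry, here is a minimal sketch of an LLM config containing a `prompt` section of the kind the new schema check validates. The base model, feature names, and template text are illustrative assumptions, not taken from this release:

```python
# Minimal sketch of a Ludwig LLM config with a prompt section.
# Base model, feature names, and template text are illustrative assumptions.
from ludwig.api import LudwigModel

config = {
    "model_type": "llm",
    "base_model": "codellama/CodeLlama-7b-hf",  # assumed model for illustration
    "prompt": {
        # Placeholders in the template refer to input feature names.
        "template": "Answer the following question: {question}",
    },
    "input_features": [{"name": "question", "type": "text"}],
    "output_features": [{"name": "answer", "type": "text"}],
}

# Config validation (including the prompt check) runs when the model is built.
model = LudwigModel(config=config)
```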
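The optimizer entry maps to the `trainer.optimizer` section of a Ludwig config. Below is a minimal sketch; the exact optimizer type strings (e.g. `paged_adam`, `lion`) are assumptions inferred from the optimizer names in the entry, so check the optimizer schema in your version for the registered names:

```python
# Minimal sketch of selecting one of the new optimizers via the trainer section.
# The type string "paged_adam" is an assumption based on the optimizer names
# in the entry above; consult the Ludwig optimizer schema for registered names.
config = {
    "model_type": "llm",
    "base_model": "codellama/CodeLlama-7b-hf",  # assumed model for illustration
    "input_features": [{"name": "question", "type": "text"}],
    "output_features": [{"name": "answer", "type": "text"}],
    "trainer": {
        "type": "finetune",
        "optimizer": {"type": "paged_adam"},  # e.g. "lion", "lamb", "lars"
        "learning_rate": 1e-4,
    },
}
```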
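The adapter entry lets the `adapter` section point at pretrained LoRA weights. A minimal sketch follows; the parameter name `pretrained_adapter_weights` and the repo ID shown are assumptions for illustration:

```python
# Minimal sketch of pointing a LoRA adapter at pretrained weights.
# The parameter name "pretrained_adapter_weights" and the repo ID are
# assumptions; a local directory path should work the same way.
config = {
    "model_type": "llm",
    "base_model": "codellama/CodeLlama-7b-hf",  # assumed model for illustration
    "adapter": {
        "type": "lora",
        # Either a Hugging Face Hub ID or a local filesystem path:
        "pretrained_adapter_weights": "someuser/codellama-7b-lora",
    },
    "input_features": [{"name": "question", "type": "text"}],
    "output_features": [{"name": "answer", "type": "text"}],
}
```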
Full Changelog: v0.8.2...v0.8.3