| Commit message | Author | Date | Files | Lines (-/+) |
|---|---|---|---|---|
| Fixed Lora training perf issue | Volpeon | 2023-03-24 | 1 | -7/+8 |
| Lora fix: Save config JSON, too | Volpeon | 2023-03-24 | 1 | -0/+3 |
| Refactoring, fixed Lora training | Volpeon | 2023-03-24 | 4 | -65/+32 |
| Bring back Perlin offset noise | Volpeon | 2023-03-23 | 1 | -1/+14 |
| Update | Volpeon | 2023-03-23 | 4 | -17/+14 |
| Fix | Volpeon | 2023-03-22 | 1 | -1/+1 |
| Log DAdam/DAdan d | Volpeon | 2023-03-21 | 1 | -0/+14 |
| Fixed SNR weighting, re-enabled xformers | Volpeon | 2023-03-21 | 2 | -21/+84 |
| Restore min SNR | Volpeon | 2023-03-19 | 1 | -13/+12 |
| New loss weighting from arxiv.org:2204.00227 | Volpeon | 2023-03-18 | 1 | -2/+5 |
| Better SNR weighting | Volpeon | 2023-03-18 | 1 | -3/+3 |
| Fixed snr weight calculation | Volpeon | 2023-03-17 | 1 | -1/+4 |
| Fix loss=nan | Volpeon | 2023-03-17 | 1 | -2/+2 |
| Test: https://arxiv.org/pdf/2303.09556.pdf | Volpeon | 2023-03-17 | 1 | -3/+8 |
| Update | Volpeon | 2023-03-07 | 2 | -15/+16 |
| Update | Volpeon | 2023-03-06 | 1 | -16/+0 |
| Added Perlin noise to training | Volpeon | 2023-03-04 | 1 | -0/+17 |
| Removed offset noise from training, added init offset to pipeline | Volpeon | 2023-03-03 | 1 | -10/+2 |
| Implemented different noise offset | Volpeon | 2023-03-03 | 2 | -22/+10 |
| Low freq noise with randomized strength | Volpeon | 2023-03-03 | 1 | -1/+8 |
| Better low freq noise | Volpeon | 2023-03-02 | 1 | -1/+1 |
| Changed low freq noise | Volpeon | 2023-03-01 | 1 | -23/+10 |
| Update | Volpeon | 2023-03-01 | 4 | -28/+32 |
| Fixed TI normalization order | Volpeon | 2023-02-21 | 3 | -15/+19 |
| Fix | Volpeon | 2023-02-21 | 1 | -6/+3 |
| Don't rely on Accelerate for gradient accumulation | Volpeon | 2023-02-21 | 2 | -30/+29 |
| Embedding normalization: Ignore tensors with grad = 0 | Volpeon | 2023-02-21 | 2 | -6/+16 |
| Update | Volpeon | 2023-02-18 | 1 | -7/+14 |
| Added Lion optimizer | Volpeon | 2023-02-17 | 1 | -4/+5 |
| Remove xformers, switch to Pytorch Nightly | Volpeon | 2023-02-17 | 4 | -9/+8 |
| Fix | Volpeon | 2023-02-16 | 1 | -4/+2 |
| Integrated WIP UniPC scheduler | Volpeon | 2023-02-16 | 1 | -8/+22 |
| Update | Volpeon | 2023-02-15 | 1 | -1/+1 |
| Made low-freq noise configurable | Volpeon | 2023-02-14 | 1 | -6/+11 |
| Better noise generation during training: https://www.crosslabs.org/blog/diffusion-with-offset-noise | Volpeon | 2023-02-13 | 1 | -0/+7 |
| Update | Volpeon | 2023-02-13 | 3 | -3/+3 |
| Fixed Lora training | Volpeon | 2023-02-08 | 1 | -18/+5 |
| Fix Lora memory usage | Volpeon | 2023-02-07 | 4 | -9/+3 |
| Add Lora | Volpeon | 2023-02-07 | 4 | -37/+214 |
| Restored LR finder | Volpeon | 2023-01-20 | 6 | -393/+82 |
| Move Accelerator preparation into strategy | Volpeon | 2023-01-19 | 3 | -16/+48 |
| Update | Volpeon | 2023-01-17 | 4 | -14/+19 |
| Fix | Volpeon | 2023-01-17 | 1 | -4/+5 |
| Fix | Volpeon | 2023-01-17 | 1 | -1/+0 |
| Make embedding decay work like Adam decay | Volpeon | 2023-01-17 | 1 | -9/+5 |
| Update | Volpeon | 2023-01-17 | 2 | -8/+8 |
| Update | Volpeon | 2023-01-17 | 4 | -21/+38 |
| Training update | Volpeon | 2023-01-16 | 3 | -12/+15 |
| Moved multi-TI code from Dreambooth to TI script | Volpeon | 2023-01-16 | 1 | -3/+14 |
| More training adjustments | Volpeon | 2023-01-16 | 3 | -8/+9 |
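Two of the techniques this log references are published elsewhere: offset noise (the crosslabs.org post linked in the 2023-02-13 commit) and Min-SNR loss weighting (arXiv:2303.09556, tested in the 2023-03-17 commit). A minimal sketch of both, following those references rather than this repository's actual code — the function names, default `offset`, and default `gamma` are illustrative assumptions:

```python
import torch


def offset_noise(latents: torch.Tensor, offset: float = 0.1) -> torch.Tensor:
    """Gaussian noise plus a per-channel constant offset (offset noise).

    Sketch of the crosslabs.org technique; names and defaults are
    illustrative, not taken from this repository.
    """
    noise = torch.randn_like(latents)
    # One random value per (batch, channel), broadcast over H and W:
    # this injects extra zero-frequency (low-frequency) noise so the
    # model learns to shift overall image brightness.
    per_channel = torch.randn(
        latents.shape[0], latents.shape[1], 1, 1, device=latents.device
    )
    return noise + offset * per_channel


def min_snr_weights(snr: torch.Tensor, gamma: float = 5.0) -> torch.Tensor:
    """Min-SNR-gamma loss weights (arXiv:2303.09556), epsilon-prediction form:
    w_t = min(SNR_t, gamma) / SNR_t, clipping the weight of low-noise steps.
    """
    return torch.clamp(snr, max=gamma) / snr
```

A timestep with SNR below `gamma` keeps weight 1, while high-SNR (low-noise) timesteps are down-weighted by `gamma / SNR`, which is what stabilizes the loss across the noise schedule.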