| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| TI: Bring back old embedding decay | Volpeon | 2023-04-04 | 1 | -5/+19 |
| Improved sparse embeddings | Volpeon | 2023-04-03 | 1 | -1/+1 |
| TI: Delta learning | Volpeon | 2023-04-03 | 1 | -26/+11 |
| TI: No tag dropout by default | Volpeon | 2023-04-03 | 1 | -1/+1 |
| Bring back Lion optimizer | Volpeon | 2023-04-03 | 1 | -3/+27 |
| Update dataset format: Separate prompt and keywords | Volpeon | 2023-04-02 | 1 | -1/+1 |
| Revert | Volpeon | 2023-04-01 | 1 | -6/+46 |
| Combined TI with embedding and LoRA | Volpeon | 2023-04-01 | 1 | -25/+5 |
| Experimental: TI via LoRA | Volpeon | 2023-04-01 | 1 | -22/+2 |
| Update | Volpeon | 2023-04-01 | 1 | -1/+0 |
| Add support for Adafactor, add TI initializer noise | Volpeon | 2023-04-01 | 1 | -2/+23 |
| Update | Volpeon | 2023-03-31 | 1 | -1/+3 |
| Update | Volpeon | 2023-03-31 | 1 | -0/+7 |
| Fix | Volpeon | 2023-03-31 | 1 | -1/+2 |
| Fix | Volpeon | 2023-03-31 | 1 | -1/+1 |
| Support Dadaptation d0, adjust sample freq when steps instead of epochs are used | Volpeon | 2023-03-31 | 1 | -4/+11 |
| Fix | Volpeon | 2023-03-31 | 1 | -1/+2 |
| Fix | Volpeon | 2023-03-28 | 1 | -1/+1 |
| Support num_train_steps arg again | Volpeon | 2023-03-28 | 1 | -9/+21 |
| Fix TI | Volpeon | 2023-03-27 | 1 | -8/+8 |
| Fix TI | Volpeon | 2023-03-27 | 1 | -1/+10 |
| Fix TI | Volpeon | 2023-03-27 | 1 | -1/+1 |
| Improved inverted tokens | Volpeon | 2023-03-26 | 1 | -1/+15 |
| Update | Volpeon | 2023-03-25 | 1 | -4/+10 |
| Update | Volpeon | 2023-03-24 | 1 | -3/+3 |
| Bring back Perlin offset noise | Volpeon | 2023-03-23 | 1 | -0/+7 |
| Update | Volpeon | 2023-03-23 | 1 | -1/+1 |
| Fix | Volpeon | 2023-03-22 | 1 | -4/+0 |
| Log DAdam/DAdan d | Volpeon | 2023-03-21 | 1 | -2/+2 |
| Added dadaptation | Volpeon | 2023-03-21 | 1 | -0/+28 |
| Fixed SNR weighting, re-enabled xformers | Volpeon | 2023-03-21 | 1 | -2/+2 |
| Test: https://arxiv.org/pdf/2303.09556.pdf | Volpeon | 2023-03-17 | 1 | -12/+26 |
| Update | Volpeon | 2023-03-07 | 1 | -4/+4 |
| Pipeline: Perlin noise for init image | Volpeon | 2023-03-04 | 1 | -1/+1 |
| Removed offset noise from training, added init offset to pipeline | Volpeon | 2023-03-03 | 1 | -1/+0 |
| Implemented different noise offset | Volpeon | 2023-03-03 | 1 | -2/+2 |
| Update | Volpeon | 2023-03-01 | 1 | -1/+1 |
| Don't rely on Accelerate for gradient accumulation | Volpeon | 2023-02-21 | 1 | -1/+1 |
| Embedding normalization: Ignore tensors with grad = 0 | Volpeon | 2023-02-21 | 1 | -11/+10 |
| Update | Volpeon | 2023-02-18 | 1 | -2/+4 |
| Added Lion optimizer | Volpeon | 2023-02-17 | 1 | -11/+27 |
| Back to xformers | Volpeon | 2023-02-17 | 1 | -3/+2 |
| Remove xformers, switch to Pytorch Nightly | Volpeon | 2023-02-17 | 1 | -2/+4 |
| Integrated WIP UniPC scheduler | Volpeon | 2023-02-16 | 1 | -1/+2 |
| Update | Volpeon | 2023-02-15 | 1 | -0/+1 |
| Update | Volpeon | 2023-02-13 | 1 | -8/+11 |
| Integrate Self-Attention-Guided (SAG) Stable Diffusion in my custom pipeline | Volpeon | 2023-02-08 | 1 | -1/+1 |
| Fixed Lora training | Volpeon | 2023-02-08 | 1 | -6/+6 |
| Add Lora | Volpeon | 2023-02-07 | 1 | -4/+6 |
| Restored LR finder | Volpeon | 2023-01-20 | 1 | -2/+19 |