doctorpangloss
7074f3191d
Fix some relative path issues
2024-08-06 21:57:57 -07:00
comfyanonymous
b334605a66
Fix OOMs happening in some cases.
...
A cloned model patcher sometimes reported a model was loaded on a device
when it wasn't.
2024-08-06 13:36:04 -04:00
comfyanonymous
c14ac98fed
Unload models and load them back in lowvram mode when there is no free vram.
2024-08-06 03:22:39 -04:00
comfyanonymous
2d75df45e6
Tweak Flux memory usage.
2024-08-05 21:58:28 -04:00
doctorpangloss
8ab6b4b697
Fix running on CPU again
2024-08-05 17:20:28 -07:00
doctorpangloss
b00964ace4
Further modifications to paths
2024-08-05 17:06:18 -07:00
doctorpangloss
39c6335331
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-08-05 16:13:20 -07:00
doctorpangloss
2bc95c1711
Test improvements and fixes
...
- move workflows to distinct json files
- add the comfy-org workflows for testing
- fix issues where workflows from Windows users would not be compatible
with backends running on Linux or macOS because of path separator
differences. Because this codebase uses get_or_download wherever
checkpoints, models, etc. are referenced, that is the only place where the
comparison is gracefully handled for downloading. Validation code
will correctly convert backslashes to forward slashes, on the assumption
that everywhere backslashes appear, and whenever a value is compared
against a list, they are intended as paths rather than literal symbols
2024-08-05 15:55:46 -07:00
comfyanonymous
8edbcf5209
Improve performance on some lowend GPUs.
2024-08-05 16:24:04 -04:00
a-One-Fan
a178e25912
Fix Flux FP64 math on XPU (#4210)
2024-08-05 01:26:20 -04:00
comfyanonymous
78e133d041
Support simple diffusers Flux loras.
2024-08-04 22:05:48 -04:00
Silver
7afa985fba
Correct spelling 'token_weight_pars_t5' to 'token_weight_pairs_t5' (#4200)
2024-08-04 17:10:02 -04:00
comfyanonymous
3b71f84b50
ONNX tracing fixes.
2024-08-04 15:45:43 -04:00
comfyanonymous
0a6b008117
Fix issue with some custom nodes.
2024-08-04 10:03:33 -04:00
comfyanonymous
f7a5107784
Fix crash.
2024-08-03 16:55:38 -04:00
comfyanonymous
91be9c2867
Tweak lowvram memory formula.
2024-08-03 16:44:50 -04:00
comfyanonymous
03c5018c98
Lower lowvram memory to 1/3 of free memory.
2024-08-03 15:14:07 -04:00
comfyanonymous
2ba5cc8b86
Fix some issues.
2024-08-03 15:06:40 -04:00
comfyanonymous
1e68002b87
Cap lowvram to half of free memory.
2024-08-03 14:50:20 -04:00
comfyanonymous
ba9095e5bd
Automatically use fp8 for diffusion model weights if:
...
- The checkpoint contains weights in fp8.
- There isn't enough memory to load the diffusion model in GPU vram.
2024-08-03 13:45:19 -04:00
comfyanonymous
f123328b82
Load T5 in fp8 if it's in fp8 in the Flux checkpoint.
2024-08-03 12:39:33 -04:00
comfyanonymous
63a7e8edba
More aggressive batch splitting.
2024-08-03 11:53:30 -04:00
comfyanonymous
ea03c9dcd2
Better per-model memory usage estimations.
2024-08-02 18:09:24 -04:00
comfyanonymous
3a9ee995cf
Tweak regular SD memory formula.
2024-08-02 17:34:30 -04:00
comfyanonymous
47da42d928
Better Flux vram estimation.
2024-08-02 17:02:35 -04:00
doctorpangloss
c348b37b7c
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-08-02 10:55:02 -07:00
Benjamin Berman
c6186bd97e
Fix pylint error
2024-08-01 21:27:16 -07:00
Alexander Brown
ce9ac2fe05
Fix clip_g/clip_l mixup (#4168)
2024-08-01 21:40:56 -04:00
comfyanonymous
e638f2858a
Hack to make all resolutions work on Flux models.
2024-08-01 21:39:18 -04:00
doctorpangloss
d9ba795385
Fixes for tests and completing merge
...
- the huggingface cache is now better used on platforms that support
symlinking when the files you are requesting already exist in the
cache
- absolute imports were changed to relative in the correct places
- StringEnumRequestParameter has a special case in validation
- fix model_management whitespace issue
- fix comfy.ops references
2024-08-01 18:28:51 -07:00
doctorpangloss
a44a039661
Fix pylint
2024-08-01 16:28:24 -07:00
doctorpangloss
0a1ae64b0b
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-08-01 16:19:11 -07:00
comfyanonymous
d420bc792a
Tweak the memory usage formulas for Flux and SD.
2024-08-01 17:53:45 -04:00
comfyanonymous
d965474aaa
Make ComfyUI split batches a higher priority than weight offload.
2024-08-01 16:39:59 -04:00
comfyanonymous
1c61361fd2
Fast preview support for Flux.
2024-08-01 16:28:11 -04:00
comfyanonymous
a6decf1e62
Fix bfloat16 potentially not being enabled on mps.
2024-08-01 16:18:44 -04:00
comfyanonymous
48eb1399c0
Try to fix mac issue.
2024-08-01 13:41:27 -04:00
comfyanonymous
d7430a1651
Add a way to load the diffusion model in fp8 with UNETLoader node.
2024-08-01 13:30:51 -04:00
comfyanonymous
f2b80f95d2
Better Mac support for the flux model.
2024-08-01 13:10:50 -04:00
comfyanonymous
1aa9cf3292
Make lowvram more aggressive on low memory machines.
2024-08-01 12:11:57 -04:00
comfyanonymous
eb96c3bd82
Fix .sft file loading (they are safetensors files).
2024-08-01 11:32:58 -04:00
comfyanonymous
5f98de7697
Load flux t5 in fp8 if weights are in fp8.
2024-08-01 11:05:56 -04:00
comfyanonymous
8d34211a7a
Fix old python versions no longer working.
2024-08-01 09:57:20 -04:00
comfyanonymous
1589b58d3e
Basic Flux Schnell and Flux Dev model implementation.
2024-08-01 09:49:29 -04:00
comfyanonymous
7ad574bffd
Mac supports bf16; just make sure you are using the latest pytorch.
2024-08-01 09:42:17 -04:00
comfyanonymous
e2382b6adb
Make lowvram less aggressive when there are large amounts of free memory.
2024-08-01 03:58:58 -04:00
doctorpangloss
603b8a59c8
Improve interaction with controlnet_aux
2024-07-30 23:11:04 -07:00
comfyanonymous
c24f897352
Fix to get fp8 working on T5 base.
2024-07-31 02:00:19 -04:00
comfyanonymous
a5991a7aa6
Fix hunyuan dit text encoder weights always being in fp32.
2024-07-31 01:34:57 -04:00
comfyanonymous
2c038ccef0
Lower CLIP memory usage by a bit.
2024-07-31 01:32:35 -04:00
comfyanonymous
b85216a3c0
Lower T5 memory usage by a few hundred MB.
2024-07-31 00:52:34 -04:00
doctorpangloss
01f2ecbfb1
Improve messaging
2024-07-30 17:10:06 -07:00
doctorpangloss
ce5fe01768
Improve performance and memory management of upscale models; improve messaging on models loaded and unloaded from the GPU
2024-07-30 17:05:53 -07:00
comfyanonymous
82cae45d44
Fix potential issue with non-clip text embeddings.
2024-07-30 14:41:13 -04:00
doctorpangloss
a94cd0b626
Fix pylint issues
2024-07-30 11:40:03 -07:00
doctorpangloss
4e851e2cfa
Fix torch 2.4.0 fix
2024-07-30 11:17:32 -07:00
doctorpangloss
34522e0914
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-07-30 11:11:45 -07:00
comfyanonymous
25853d0be8
Use common function for casting weights to input.
2024-07-30 10:49:14 -04:00
comfyanonymous
79040635da
Remove unnecessary code.
2024-07-30 05:01:34 -04:00
comfyanonymous
66d35c07ce
Reduce artifacts on hydit, auraflow and SD3 at specific resolutions.
...
This breaks seeds for resolutions that are not a multiple of 16 in pixel
resolution by using circular padding instead of reflection padding, but it
should lower the amount of artifacts when doing img2img at those
resolutions.
2024-07-29 20:48:50 -04:00
doctorpangloss
3dc2942119
README now documents REST API usage with examples. Also always enable Save (API)
2024-07-29 08:01:19 -07:00
comfyanonymous
4ba7fa0244
Refactor: Move sd2_clip.py to text_encoders folder.
2024-07-28 01:19:20 -04:00
comfyanonymous
cf4418b806
Don't treat Bert model like CLIP.
...
Bert can accept up to 512 tokens, so any prompt with more than 77 tokens
should just be passed to it as-is instead of being split up the way it is for CLIP.
2024-07-26 13:08:12 -04:00
comfyanonymous
8328a2d8cd
Let hunyuan dit work with all prompt lengths.
2024-07-26 12:11:32 -04:00
comfyanonymous
afe732bef9
Hunyuan dit can now accept longer prompts.
2024-07-26 11:52:58 -04:00
comfyanonymous
a9ac56fc0d
Own BertModel implementation that works with lowvram.
2024-07-26 04:47:17 -04:00
comfyanonymous
25b51b1a8b
Hunyuan DiT lora support.
2024-07-25 22:42:54 -04:00
comfyanonymous
a5f4292f9f
Basic hunyuan dit implementation. (#4102)
...
* Let tokenizers return weights to be stored in the saved checkpoint.
* Basic hunyuan dit implementation.
* Fix some resolutions not working.
* Support hydit checkpoint save.
* Init with right dtype.
* Switch to optimized attention in pooler.
* Fix black images on hunyuan dit.
2024-07-25 18:21:08 -04:00
comfyanonymous
f87810cd3e
Let tokenizers return weights to be stored in the saved checkpoint.
2024-07-25 10:52:09 -04:00
comfyanonymous
10c919f4c7
Make it possible to load tokenizer data from checkpoints.
2024-07-24 16:43:53 -04:00
comfyanonymous
10b43ceea5
Remove duplicate code.
2024-07-24 01:12:59 -04:00
doctorpangloss
5c5e101ba3
Fix uv support and better protobuf spec
2024-07-23 17:03:25 -07:00
comfyanonymous
0a4c49c57c
Support MT5.
2024-07-23 15:35:28 -04:00
comfyanonymous
88ed893034
Allow SPieceTokenizer to load model from a byte string.
2024-07-23 14:17:42 -04:00
comfyanonymous
334ba48cea
More generic unet prefix detection code.
2024-07-23 14:13:32 -04:00
comfyanonymous
14764aa2e2
Rename LLAMATokenizer to SPieceTokenizer.
2024-07-22 12:21:45 -04:00
comfyanonymous
b2c995f623
"auto" type is only relevant to the SetUnionControlNetType node.
2024-07-22 11:30:38 -04:00
Chenlei Hu
4151fbfa8a
Add error message on union controlnet (#4081)
2024-07-22 11:27:32 -04:00
Benjamin Berman
a7a84fe747
Merge branch 'add-group-templating' of https://github.com/hku/ComfyUI into pr-2788
2024-07-21 14:37:12 -07:00
comfyanonymous
95fa9545f1
Only append zero to noise schedule if last sigma isn't zero.
2024-07-20 12:37:30 -04:00
Benjamin Berman
cf9fdc5feb
Traversable differences between python 3.10 and 3.11
2024-07-19 22:20:39 -07:00
doctorpangloss
87ab9d42d0
Merge branch 'execution_model_inversion' of github.com:guill/ComfyUI into pr-execution
2024-07-19 17:49:41 -07:00
comfyanonymous
6ab8cad22e
Implement beta sampling scheduler.
...
It is based on: https://arxiv.org/abs/2407.12173
Add "beta" to the list of schedulers and the BetaSamplingScheduler node.
2024-07-19 18:05:09 -04:00
doctorpangloss
499545c373
Improve requirements.txt for faster installation; improve validation error reporting
2024-07-19 09:16:18 -07:00
doctorpangloss
0c34c2b99d
Fix #13: audio nodes now work and test correctly
2024-07-18 17:15:44 -07:00
doctorpangloss
cc99d89ac6
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-07-18 16:31:21 -07:00
doctorpangloss
baf19e8563
Fix downloading URL files
2024-07-18 13:26:20 -07:00
doctorpangloss
5459cfa832
Improve model downloading, add FolderPaths object for custom nodes
2024-07-17 17:20:47 -07:00
喵哩个咪
855789403b
Support clip-vit-large-patch14-336 (#4042)
...
* support clip-vit-large-patch14-336
* support clip-vit-large-patch14-336
2024-07-17 13:12:50 -04:00
comfyanonymous
6f7869f365
Get clip vision image size from config.
2024-07-17 13:05:38 -04:00
comfyanonymous
281ad42df4
Fix lowvram union controlnet bug.
2024-07-17 10:16:31 -04:00
doctorpangloss
d98c2c5456
Fix missing __init__.py
2024-07-16 15:30:59 -07:00
Thomas Ward
c5a48b15bd
Make default hash lib configurable without code changes via CLI argument (#3947)
...
* cli_args: Add --duplicate-check-hash-function.
* server.py: compare_image_hash configurable hash function
Uses an argument added in cli_args to specify the hashing algorithm to default to for duplicate hash checking. Uses `eval()` to identify the specific hashlib class to use, but this operates safely because the arg parser only accepts the specific options/choices we define, so there is no unsafe input.
* Add hasher() to node_helpers
* hashlib selection moved to node_helpers
* default-hashing-function instead of dupe checking hasher
This makes it a default-hashing-function option instead of the previously selected option.
* Use args.default_hashing_function
* Use safer handling for node_helpers.hasher()
Uses a safer method than `eval` to resolve the default hashing function.
* Stray parentheses are evil.
* Indentation fix.
Somehow when I hit save I didn't notice I had missed a space needed to make the indentation work properly. Oops!
2024-07-16 18:27:09 -04:00
comfyanonymous
8270c62530
Add SetUnionControlNetType to set the type of the union controlnet model.
2024-07-16 17:04:53 -04:00
doctorpangloss
72baecad87
Improve logging and tracing for validation errors
2024-07-16 12:26:30 -07:00
comfyanonymous
821f93872e
Allow model sampling to set number of timesteps.
2024-07-16 15:18:40 -04:00
Chenlei Hu
99458e8aca
Add FrontendManager to manage non-default front-end impl (#3897)
...
* Add frontend manager
* Add tests
* nit
* Add unit test to github CI
* Fix path
* nit
* ignore
* Add logging
* Install test deps
* Remove 'stable' keyword support
* Update test
* Add web-root arg
* Rename web-root to front-end-root
* Add test on non-exist version number
* Use repo owner/name to replace hard coded provider list
* Inline cmd args
* nit
* Fix unit test
2024-07-16 11:26:11 -04:00
doctorpangloss
a20bf8134d
Fix AuraFlow
2024-07-15 15:29:49 -07:00
comfyanonymous
1305fb294c
Refactor: Move some code to the comfy/text_encoders folder.
2024-07-15 17:36:24 -04:00
doctorpangloss
3d1d833e6f
Merge branch 'master' of github.com:comfyanonymous/ComfyUI
2024-07-15 14:22:49 -07:00