runpod/comfyui/workflows/image-to-video/i2v_wan22-14b-i2v.json
Sebastian Krüger 6efb55c59f
feat: add complete HunyuanVideo and Wan2.2 video generation integration
Integrated 35+ video generation models and 13 production workflows from ComfyUI docs tutorials for state-of-the-art text-to-video and image-to-video generation.

Models Added (models_huggingface.yaml):
- HunyuanVideo (5 models): Original T2V/I2V (720p), v1.5 (720p/1080p) with Qwen 2.5 VL
- Wan2.2 diffusion models (18 models):
  - 5B TI2V hybrid (8GB VRAM, efficient)
  - 14B variants: T2V, I2V (high/low noise), Animate, S2V (FP8/BF16), Fun Camera/Control (high/low noise)
- Support models (12): VAEs, UMT5-XXL, CLIP Vision H, Wav2Vec2, LLaVA encoders
- LoRA accelerators (4): Lightx2v 4-step distillation for 5x speedup
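
The manifest entries above resolve to plain HuggingFace file URLs, using the same name/url/directory triplets that the workflow JSON below embeds in its loader nodes. A minimal download sketch, assuming a standard ComfyUI models layout; the MODELS_DIR path and the single example entry are illustrative, not the actual models_huggingface.yaml schema:

```python
import os
import urllib.request

# Assumed ComfyUI models root; adjust to the RunPod volume layout if it differs.
MODELS_DIR = "/workspace/ComfyUI/models"

# One example entry in the name/url/directory shape used by the workflow metadata.
entry = {
    "name": "wan_2.1_vae.safetensors",
    "url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/vae/wan_2.1_vae.safetensors",
    "directory": "vae",
}

target_dir = os.path.join(MODELS_DIR, entry["directory"])
os.makedirs(target_dir, exist_ok=True)
target = os.path.join(target_dir, entry["name"])
if not os.path.exists(target):
    urllib.request.urlretrieve(entry["url"], target)  # stream the file to disk
```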

Workflows Added (comfyui/workflows/image-to-video/):
- HunyuanVideo (5 workflows): original T2V, I2V v1/v2 (workflows embedded in WebP metadata), v1.5 T2V/I2V (JSON)
- Wan2.2 (8 workflows): 5B TI2V, 14B T2V/I2V/FLF2V/Animate/S2V/Fun Camera/Fun Control
- Asset files (10): Reference images, videos, audio for workflow testing
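
The JSON workflows are stored in the ComfyUI editor format (the file below is one of them). To run one headlessly on the RunPod worker, an API-format export of the same graph is submitted to ComfyUI's /prompt endpoint. A minimal sketch, assuming a server at 127.0.0.1:8188 and a hypothetical API-format export named i2v_wan22-14b-i2v_api.json (neither is part of this commit):

```python
import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188"  # assumed local ComfyUI instance

# Hypothetical API-format export of the editor workflow shipped in this commit.
with open("i2v_wan22-14b-i2v_api.json") as f:
    prompt_graph = json.load(f)

payload = json.dumps({"prompt": prompt_graph}).encode("utf-8")
req = urllib.request.Request(
    f"{COMFYUI_URL}/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))  # response includes the queued prompt_id
```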

Custom Nodes Added (arty.yml):
- ComfyUI-KJNodes: Kijai optimizations for HunyuanVideo/Wan2.2 (FP8 scaling, video helpers)
- comfyui_controlnet_aux: ControlNet preprocessors (Canny, Depth, OpenPose, MLSD) for Fun Control
- ComfyUI-GGUF: GGUF quantization support for memory optimization
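
ComfyUI custom node packs are installed by cloning them into ComfyUI/custom_nodes and installing their Python requirements; arty.yml presumably drives this during the image build. A rough sketch of the equivalent manual step, with the upstream repository URLs assumed from the packs' usual GitHub locations (verify against arty.yml):

```python
import subprocess
from pathlib import Path

CUSTOM_NODES = Path("/workspace/ComfyUI/custom_nodes")  # assumed install location

# Assumed upstream repositories for the three packs referenced in arty.yml.
repos = [
    "https://github.com/kijai/ComfyUI-KJNodes",
    "https://github.com/Fannovel16/comfyui_controlnet_aux",
    "https://github.com/city96/ComfyUI-GGUF",
]

for url in repos:
    dest = CUSTOM_NODES / url.rsplit("/", 1)[-1]
    if not dest.exists():
        subprocess.run(["git", "clone", "--depth", "1", url, str(dest)], check=True)
    reqs = dest / "requirements.txt"
    if reqs.exists():
        subprocess.run(["pip", "install", "-r", str(reqs)], check=True)
```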

VRAM Requirements:
- HunyuanVideo original: 24GB (720p T2V/I2V, 129 frames, ~5-second clips)
- HunyuanVideo 1.5: 30-60GB (720p/1080p, improved quality with Qwen 2.5 VL)
- Wan2.2 5B: 8GB (efficient hybrid TI2V model with native offloading)
- Wan2.2 14B: 24GB (high-quality video generation, all modes)

Note: the Wan2.2 Fun Inpaint workflow is not available in the official templates repository (404).

Tutorial Sources:
- https://docs.comfy.org/tutorials/video/hunyuan/hunyuan-video
- https://docs.comfy.org/tutorials/video/hunyuan/hunyuan-video-1-5
- https://docs.comfy.org/tutorials/video/wan/wan2_2
- https://docs.comfy.org/tutorials/video/wan/wan2-2-animate
- https://docs.comfy.org/tutorials/video/wan/wan2-2-s2v
- https://docs.comfy.org/tutorials/video/wan/wan2-2-fun-camera
- https://docs.comfy.org/tutorials/video/wan/wan2-2-fun-control

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-25 10:43:39 +01:00

{
"id": "ec7da562-7e21-4dac-a0d2-f4441e1efd3b",
"revision": 0,
"last_node_id": 115,
"last_link_id": 214,
"nodes": [
{
"id": 38,
"type": "CLIPLoader",
"pos": [
70,
1360
],
"size": [
346.391845703125,
106
],
"flags": {},
"order": 0,
"mode": 4,
"inputs": [],
"outputs": [
{
"name": "CLIP",
"type": "CLIP",
"slot_index": 0,
"links": [
74,
75
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPLoader",
"models": [
{
"name": "umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/resolve/main/split_files/text_encoders/umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"directory": "text_encoders"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"wan",
"default"
]
},
{
"id": 7,
"type": "CLIPTextEncode",
"pos": [
450,
1580
],
"size": [
425.27801513671875,
180.6060791015625
],
"flags": {},
"order": 16,
"mode": 4,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 75
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
135
]
}
],
"title": "CLIP Text Encode (Negative Prompt)",
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPTextEncode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"色调艳丽过曝静态细节模糊不清字幕风格作品画作画面静止整体发灰最差质量低质量JPEG压缩残留丑陋的残缺的多余的手指画得不好的手部画得不好的脸部畸形的毁容的形态畸形的肢体手指融合静止不动的画面杂乱的背景三条腿背景人很多倒着走"
],
"color": "#322",
"bgcolor": "#533"
},
{
"id": 39,
"type": "VAELoader",
"pos": [
70,
1520
],
"size": [
344.731689453125,
59.98149108886719
],
"flags": {},
"order": 1,
"mode": 4,
"inputs": [],
"outputs": [
{
"name": "VAE",
"type": "VAE",
"slot_index": 0,
"links": [
76,
141
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "VAELoader",
"models": [
{
"name": "wan_2.1_vae.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/vae/wan_2.1_vae.safetensors",
"directory": "vae"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan_2.1_vae.safetensors"
]
},
{
"id": 54,
"type": "ModelSamplingSD3",
"pos": [
670,
1100
],
"size": [
210,
60
],
"flags": {},
"order": 17,
"mode": 4,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 110
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
147
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "ModelSamplingSD3",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
8.000000000000002
]
},
{
"id": 55,
"type": "ModelSamplingSD3",
"pos": [
670,
1230
],
"size": [
210,
58
],
"flags": {},
"order": 18,
"mode": 4,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 112
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
148
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "ModelSamplingSD3",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
8
]
},
{
"id": 6,
"type": "CLIPTextEncode",
"pos": [
450,
1380
],
"size": [
422.84503173828125,
164.31304931640625
],
"flags": {},
"order": 15,
"mode": 4,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 74
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
134
]
}
],
"title": "CLIP Text Encode (Positive Prompt)",
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPTextEncode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"The white dragon warrior stands still, eyes full of determination and strength. The camera slowly moves closer or circles around the warrior, highlighting the powerful presence and heroic spirit of the character."
],
"color": "#232",
"bgcolor": "#353"
},
{
"id": 37,
"type": "UNETLoader",
"pos": [
70,
1100
],
"size": [
346.7470703125,
82
],
"flags": {},
"order": 2,
"mode": 4,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
110
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "UNETLoader",
"models": [
{
"name": "wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"directory": "diffusion_models"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"default"
]
},
{
"id": 56,
"type": "UNETLoader",
"pos": [
70,
1230
],
"size": [
346.7470703125,
82
],
"flags": {},
"order": 3,
"mode": 4,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
112
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "UNETLoader",
"models": [
{
"name": "wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"directory": "diffusion_models"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"default"
]
},
{
"id": 63,
"type": "WanImageToVideo",
"pos": [
480,
1860
],
"size": [
342.5999755859375,
210
],
"flags": {},
"order": 23,
"mode": 4,
"inputs": [
{
"name": "positive",
"type": "CONDITIONING",
"link": 134
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 135
},
{
"name": "vae",
"type": "VAE",
"link": 141
},
{
"name": "clip_vision_output",
"shape": 7,
"type": "CLIP_VISION_OUTPUT",
"link": null
},
{
"name": "start_image",
"shape": 7,
"type": "IMAGE",
"link": 133
}
],
"outputs": [
{
"name": "positive",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
136,
138
]
},
{
"name": "negative",
"type": "CONDITIONING",
"slot_index": 1,
"links": [
137,
139
]
},
{
"name": "latent",
"type": "LATENT",
"slot_index": 2,
"links": [
140
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "WanImageToVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
640,
640,
81,
1
]
},
{
"id": 84,
"type": "CLIPLoader",
"pos": [
60,
30
],
"size": [
346.391845703125,
106
],
"flags": {},
"order": 4,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "CLIP",
"type": "CLIP",
"slot_index": 0,
"links": [
178,
181
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPLoader",
"models": [
{
"name": "umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/resolve/main/split_files/text_encoders/umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"directory": "text_encoders"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"umt5_xxl_fp8_e4m3fn_scaled.safetensors",
"wan",
"default"
]
},
{
"id": 90,
"type": "VAELoader",
"pos": [
60,
190
],
"size": [
344.731689453125,
59.98149108886719
],
"flags": {},
"order": 5,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "VAE",
"type": "VAE",
"slot_index": 0,
"links": [
176,
185
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "VAELoader",
"models": [
{
"name": "wan_2.1_vae.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/vae/wan_2.1_vae.safetensors",
"directory": "vae"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan_2.1_vae.safetensors"
]
},
{
"id": 95,
"type": "UNETLoader",
"pos": [
50,
-230
],
"size": [
346.7470703125,
82
],
"flags": {},
"order": 6,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
194
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "UNETLoader",
"models": [
{
"name": "wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"directory": "diffusion_models"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors",
"default"
]
},
{
"id": 96,
"type": "UNETLoader",
"pos": [
50,
-100
],
"size": [
346.7470703125,
82
],
"flags": {},
"order": 7,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
196
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "UNETLoader",
"models": [
{
"name": "wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"directory": "diffusion_models"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors",
"default"
]
},
{
"id": 103,
"type": "ModelSamplingSD3",
"pos": [
740,
-100
],
"size": [
210,
58
],
"flags": {
"collapsed": false
},
"order": 26,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 189
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
192
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "ModelSamplingSD3",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
5.000000000000001
]
},
{
"id": 93,
"type": "CLIPTextEncode",
"pos": [
440,
90
],
"size": [
510,
160
],
"flags": {},
"order": 20,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 181
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
183
]
}
],
"title": "CLIP Text Encode (Positive Prompt)",
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPTextEncode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"The white dragon warrior stands still, eyes full of determination and strength. The camera slowly moves closer or circles around the warrior, highlighting the powerful presence and heroic spirit of the character."
],
"color": "#232",
"bgcolor": "#353"
},
{
"id": 89,
"type": "CLIPTextEncode",
"pos": [
440,
290
],
"size": [
510,
130
],
"flags": {},
"order": 19,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 178
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
184
]
}
],
"title": "CLIP Text Encode (Negative Prompt)",
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CLIPTextEncode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"色调艳丽过曝静态细节模糊不清字幕风格作品画作画面静止整体发灰最差质量低质量JPEG压缩残留丑陋的残缺的多余的手指画得不好的手部画得不好的脸部畸形的毁容的形态畸形的肢体手指融合静止不动的画面杂乱的背景三条腿背景人很多倒着走"
],
"color": "#322",
"bgcolor": "#533"
},
{
"id": 101,
"type": "LoraLoaderModelOnly",
"pos": [
450,
-230
],
"size": [
280,
82
],
"flags": {},
"order": 21,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 194
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"links": [
190
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.49",
"Node name for S&R": "LoraLoaderModelOnly",
"models": [
{
"name": "wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/loras/wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors",
"directory": "loras"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors",
1.0000000000000002
]
},
{
"id": 102,
"type": "LoraLoaderModelOnly",
"pos": [
450,
-100
],
"size": [
280,
82
],
"flags": {},
"order": 22,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 196
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"links": [
189
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.49",
"Node name for S&R": "LoraLoaderModelOnly",
"models": [
{
"name": "wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors",
"url": "https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/loras/wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors",
"directory": "loras"
}
],
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors",
1.0000000000000002
]
},
{
"id": 105,
"type": "MarkdownNote",
"pos": [
-470,
280
],
"size": [
480,
180
],
"flags": {},
"order": 8,
"mode": 0,
"inputs": [],
"outputs": [],
"title": "VRAM Usage",
"properties": {
"ue_properties": {
"version": "7.1",
"widget_ue_connectable": {},
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"## GPU:RTX4090D 24GB\n\n| Model | Size |VRAM Usage | 1st Generation | 2nd Generation |\n|---------------------|-------|-----------|---------------|-----------------|\n| fp8_scaled |640*640| 84% | ≈ 536s | ≈ 513s |\n| fp8_scaled + 4steps LoRA | 640*640 | 83% | ≈ 97s | ≈ 71s |"
],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 62,
"type": "LoadImage",
"pos": [
80,
1740
],
"size": [
315,
314
],
"flags": {},
"order": 9,
"mode": 4,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
133
]
},
{
"name": "MASK",
"type": "MASK",
"slot_index": 1,
"links": null
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "LoadImage",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"video_wan2_2_14B_i2v_input_image.jpg",
"image"
]
},
{
"id": 106,
"type": "MarkdownNote",
"pos": [
-350,
1010
],
"size": [
370,
110
],
"flags": {},
"order": 10,
"mode": 0,
"inputs": [],
"outputs": [],
"properties": {
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"1. Box-select then use Ctrl + B to enable\n2. If you don't want to run both groups simultaneously, don't forget to use **Ctrl + B** to disable the **fp8_scaled + 4steps LoRA** group after enabling the **fp8_scaled** group, or try the [partial - execution](https://docs.comfy.org/interface/features/partial-execution) feature."
],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 67,
"type": "Note",
"pos": [
510,
820
],
"size": [
390,
88
],
"flags": {},
"order": 11,
"mode": 0,
"inputs": [],
"outputs": [],
"title": "Video Size",
"properties": {
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"By default, we set the video to a smaller size for users with low VRAM. If you have enough VRAM, you can change the size"
],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 104,
"type": "ModelSamplingSD3",
"pos": [
740,
-230
],
"size": [
210,
60
],
"flags": {},
"order": 25,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 190
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
195
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "ModelSamplingSD3",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
5.000000000000001
]
},
{
"id": 98,
"type": "WanImageToVideo",
"pos": [
530,
530
],
"size": [
342.5999755859375,
210
],
"flags": {},
"order": 24,
"mode": 0,
"inputs": [
{
"name": "positive",
"type": "CONDITIONING",
"link": 183
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 184
},
{
"name": "vae",
"type": "VAE",
"link": 185
},
{
"name": "clip_vision_output",
"shape": 7,
"type": "CLIP_VISION_OUTPUT",
"link": null
},
{
"name": "start_image",
"shape": 7,
"type": "IMAGE",
"link": 186
}
],
"outputs": [
{
"name": "positive",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
168,
172
]
},
{
"name": "negative",
"type": "CONDITIONING",
"slot_index": 1,
"links": [
169,
173
]
},
{
"name": "latent",
"type": "LATENT",
"slot_index": 2,
"links": [
174
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "WanImageToVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
640,
640,
81,
1
]
},
{
"id": 94,
"type": "CreateVideo",
"pos": [
1350,
460
],
"size": [
270,
78
],
"flags": {},
"order": 34,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 182
},
{
"name": "audio",
"shape": 7,
"type": "AUDIO",
"link": null
}
],
"outputs": [
{
"name": "VIDEO",
"type": "VIDEO",
"links": [
197
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "CreateVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
16
]
},
{
"id": 87,
"type": "VAEDecode",
"pos": [
1060,
480
],
"size": [
210,
46
],
"flags": {},
"order": 32,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 175
},
{
"name": "vae",
"type": "VAE",
"link": 176
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
182
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "VAEDecode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": []
},
{
"id": 108,
"type": "SaveVideo",
"pos": [
1690,
-250
],
"size": [
890,
988
],
"flags": {},
"order": 36,
"mode": 0,
"inputs": [
{
"name": "video",
"type": "VIDEO",
"link": 197
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.49",
"Node name for S&R": "SaveVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"video/ComfyUI",
"auto",
"auto"
]
},
{
"id": 115,
"type": "Note",
"pos": [
30,
-470
],
"size": [
360,
100
],
"flags": {},
"order": 12,
"mode": 0,
"inputs": [],
"outputs": [],
"title": "About 4 Steps LoRA",
"properties": {
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"Using the Wan2.2 Lighting LoRA will result in the loss of video dynamics, but it will reduce the generation time. This template provides two workflows, and you can enable one as needed."
],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 86,
"type": "KSamplerAdvanced",
"pos": [
990,
-250
],
"size": [
304.748046875,
546
],
"flags": {},
"order": 28,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 195
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 172
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 173
},
{
"name": "latent_image",
"type": "LATENT",
"link": 174
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
170
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "KSamplerAdvanced",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"enable",
138073435077572,
"randomize",
4,
1,
"euler",
"simple",
0,
2,
"enable"
]
},
{
"id": 85,
"type": "KSamplerAdvanced",
"pos": [
1336.748046875,
-250
],
"size": [
304.748046875,
546
],
"flags": {},
"order": 30,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 192
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 168
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 169
},
{
"name": "latent_image",
"type": "LATENT",
"link": 170
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
175
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "KSamplerAdvanced",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"disable",
0,
"fixed",
4,
1,
"euler",
"simple",
2,
4,
"disable"
]
},
{
"id": 58,
"type": "KSamplerAdvanced",
"pos": [
1240,
1110
],
"size": [
304.748046875,
498.6905822753906
],
"flags": {},
"order": 29,
"mode": 4,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 148
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 138
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 139
},
{
"name": "latent_image",
"type": "LATENT",
"link": 113
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
124
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "KSamplerAdvanced",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"disable",
0,
"fixed",
20,
3.5,
"euler",
"simple",
10,
10000,
"disable"
]
},
{
"id": 109,
"type": "CreateVideo",
"pos": [
1250,
1740
],
"size": [
270,
78
],
"flags": {},
"order": 33,
"mode": 4,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 198
},
{
"name": "audio",
"shape": 7,
"type": "AUDIO",
"link": null
}
],
"outputs": [
{
"name": "VIDEO",
"type": "VIDEO",
"links": [
199
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.49",
"Node name for S&R": "CreateVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
16
]
},
{
"id": 8,
"type": "VAEDecode",
"pos": [
920,
1750
],
"size": [
210,
46
],
"flags": {},
"order": 31,
"mode": 4,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 124
},
{
"name": "vae",
"type": "VAE",
"link": 76
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
198
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "VAEDecode",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": []
},
{
"id": 57,
"type": "KSamplerAdvanced",
"pos": [
920,
1110
],
"size": [
310,
500
],
"flags": {},
"order": 27,
"mode": 4,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 147
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 136
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 137
},
{
"name": "latent_image",
"type": "LATENT",
"link": 140
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
113
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "KSamplerAdvanced",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"enable",
306189419888845,
"randomize",
20,
3.5,
"euler",
"simple",
0,
10,
"enable"
]
},
{
"id": 61,
"type": "SaveVideo",
"pos": [
1580,
1110
],
"size": [
990,
990
],
"flags": {},
"order": 35,
"mode": 4,
"inputs": [
{
"name": "video",
"type": "VIDEO",
"link": 199
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "SaveVideo",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"video/ComfyUI",
"auto",
"auto"
]
},
{
"id": 66,
"type": "MarkdownNote",
"pos": [
-470,
-320
],
"size": [
480,
530
],
"flags": {},
"order": 13,
"mode": 0,
"inputs": [],
"outputs": [],
"title": "Model Links",
"properties": {
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"[Tutorial](https://docs.comfy.org/tutorials/video/wan/wan2_2\n)\n\n**Diffusion Model**\n- [wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors](https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors)\n- [wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors](https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/diffusion_models/wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors)\n\n**LoRA**\n- [wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors](https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/loras/wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors)\n- [wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors](https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/loras/wan2.2_i2v_lightx2v_4steps_lora_v1_high_noise.safetensors)\n\n**VAE**\n- [wan_2.1_vae.safetensors](https://huggingface.co/Comfy-Org/Wan_2.2_ComfyUI_Repackaged/resolve/main/split_files/vae/wan_2.1_vae.safetensors)\n\n**Text Encoder** \n- [umt5_xxl_fp8_e4m3fn_scaled.safetensors](https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/resolve/main/split_files/text_encoders/umt5_xxl_fp8_e4m3fn_scaled.safetensors)\n\n\nFile save location\n\n```\nComfyUI/\n├───📂 models/\n│ ├───📂 diffusion_models/\n│ │ ├─── wan2.2_i2v_low_noise_14B_fp8_scaled.safetensors\n│ │ └─── wan2.2_i2v_high_noise_14B_fp8_scaled.safetensors\n│ ├───📂 loras/\n│ │ ├─── wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors\n│ │ └─── wan2.2_i2v_lightx2v_4steps_lora_v1_low_noise.safetensors\n│ ├───📂 text_encoders/\n│ │ └─── umt5_xxl_fp8_e4m3fn_scaled.safetensors \n│ └───📂 vae/\n│ └── wan_2.1_vae.safetensors\n```\n"
],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 97,
"type": "LoadImage",
"pos": [
70,
400
],
"size": [
315,
314.0001220703125
],
"flags": {},
"order": 14,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
186
]
},
{
"name": "MASK",
"type": "MASK",
"slot_index": 1,
"links": null
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.45",
"Node name for S&R": "LoadImage",
"ue_properties": {
"widget_ue_connectable": {},
"version": "7.1",
"input_ue_unconnectable": {}
}
},
"widgets_values": [
"video_wan2_2_14B_i2v_input_image.jpg",
"image"
]
}
],
"links": [
[
74,
38,
0,
6,
0,
"CLIP"
],
[
75,
38,
0,
7,
0,
"CLIP"
],
[
76,
39,
0,
8,
1,
"VAE"
],
[
110,
37,
0,
54,
0,
"MODEL"
],
[
112,
56,
0,
55,
0,
"MODEL"
],
[
113,
57,
0,
58,
3,
"LATENT"
],
[
124,
58,
0,
8,
0,
"LATENT"
],
[
133,
62,
0,
63,
4,
"IMAGE"
],
[
134,
6,
0,
63,
0,
"CONDITIONING"
],
[
135,
7,
0,
63,
1,
"CONDITIONING"
],
[
136,
63,
0,
57,
1,
"CONDITIONING"
],
[
137,
63,
1,
57,
2,
"CONDITIONING"
],
[
138,
63,
0,
58,
1,
"CONDITIONING"
],
[
139,
63,
1,
58,
2,
"CONDITIONING"
],
[
140,
63,
2,
57,
3,
"LATENT"
],
[
141,
39,
0,
63,
2,
"VAE"
],
[
147,
54,
0,
57,
0,
"MODEL"
],
[
148,
55,
0,
58,
0,
"MODEL"
],
[
168,
98,
0,
85,
1,
"CONDITIONING"
],
[
169,
98,
1,
85,
2,
"CONDITIONING"
],
[
170,
86,
0,
85,
3,
"LATENT"
],
[
172,
98,
0,
86,
1,
"CONDITIONING"
],
[
173,
98,
1,
86,
2,
"CONDITIONING"
],
[
174,
98,
2,
86,
3,
"LATENT"
],
[
175,
85,
0,
87,
0,
"LATENT"
],
[
176,
90,
0,
87,
1,
"VAE"
],
[
178,
84,
0,
89,
0,
"CLIP"
],
[
181,
84,
0,
93,
0,
"CLIP"
],
[
182,
87,
0,
94,
0,
"IMAGE"
],
[
183,
93,
0,
98,
0,
"CONDITIONING"
],
[
184,
89,
0,
98,
1,
"CONDITIONING"
],
[
185,
90,
0,
98,
2,
"VAE"
],
[
186,
97,
0,
98,
4,
"IMAGE"
],
[
189,
102,
0,
103,
0,
"MODEL"
],
[
190,
101,
0,
104,
0,
"MODEL"
],
[
192,
103,
0,
85,
0,
"MODEL"
],
[
194,
95,
0,
101,
0,
"MODEL"
],
[
195,
104,
0,
86,
0,
"MODEL"
],
[
196,
96,
0,
102,
0,
"MODEL"
],
[
197,
94,
0,
108,
0,
"VIDEO"
],
[
198,
8,
0,
109,
0,
"IMAGE"
],
[
199,
109,
0,
61,
0,
"VIDEO"
]
],
"groups": [
{
"id": 10,
"title": "fp8_scaled",
"bounding": [
40,
980,
2570,
1150
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 1,
"title": "Step1 - Load models",
"bounding": [
50,
1020,
371.0310363769531,
571.3974609375
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 2,
"title": "Step2 - Upload start_image",
"bounding": [
50,
1620,
370,
470
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 3,
"title": "Step4 - Prompt",
"bounding": [
440,
1310,
445.27801513671875,
464.2060852050781
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 4,
"title": "Step3 - Video size & length",
"bounding": [
440,
1790,
440,
300
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 15,
"title": "fp8_scaled + 4steps LoRA",
"bounding": [
30,
-350,
2580,
1120
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 11,
"title": "Step1 - Load models",
"bounding": [
40,
-310,
371.0310363769531,
571.3974609375
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 12,
"title": "Step2 - Upload start_image",
"bounding": [
40,
280,
370,
470
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 13,
"title": "Step4 - Prompt",
"bounding": [
430,
20,
530,
420
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 14,
"title": "Step3 - Video size & length",
"bounding": [
430,
460,
530,
290
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 16,
"title": "Lightx2v 4steps LoRA",
"bounding": [
430,
-310,
530,
310
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
}
],
"config": {},
"extra": {
"ds": {
"scale": 0.45000000000000046,
"offset": [
1208.3999122160667,
781.2015842319031
]
},
"frontendVersion": "1.27.10",
"VHS_latentpreview": false,
"VHS_latentpreviewrate": 0,
"VHS_MetadataImage": true,
"VHS_KeepIntermediate": true,
"ue_links": []
},
"version": 0.4
}
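
For reference, each loader node above records its model requirement under properties.models as a name/url/directory triplet, so the full download manifest for this workflow can be derived from the file itself. A small sketch that prints that manifest; the relative path assumes the script is run from the repository root:

```python
import json

# Path of this workflow inside the repository.
WORKFLOW = "runpod/comfyui/workflows/image-to-video/i2v_wan22-14b-i2v.json"

with open(WORKFLOW) as f:
    graph = json.load(f)

seen = set()
for node in graph["nodes"]:
    for model in node.get("properties", {}).get("models", []):
        key = (model["directory"], model["name"])
        if key not in seen:
            seen.add(key)
            print(f"{model['directory']}/{model['name']}  <-  {model['url']}")
```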