fofr/flux-dev-layers 🔢📝❓ → 🖼️

▶️ 8.5K runs 📅 Sep 2024 ⚙️ Cog 0.9.12 🔗 GitHub ⚖️ License
flux layer-patching model-interpretability text-to-image

About

Explore how Flux Dev responds when you change the strengths of individual layers in the model. See the readme for examples of how to select layers.

Example Output

Prompt:

"a closeup portrait photo"

Output

Example output

Performance Metrics

28.15s Prediction Time
28.16s Total Time
All Input Parameters
{
  "prompt": "a closeup portrait photo",
  "max_shift": 1.15,
  "base_shift": 0.5,
  "num_outputs": 1,
  "aspect_ratio": "3:4",
  "output_format": "webp",
  "guidance_scale": 3,
  "output_quality": 95,
  "num_inference_steps": 28,
  "flux_layers_to_patch": "attn=1.1"
}
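The payload above can be sanity-checked locally before submitting a prediction. The helper below is an illustrative sketch, not part of the Replicate client; the numeric bounds are copied from the input schema on this page.

```python
# Check a prediction payload against the documented parameter ranges.
# Bounds are taken from the input schema on this page; the helper itself
# is a hypothetical convenience, not part of the model's API.

RANGES = {
    "max_shift": (0, 10),
    "base_shift": (0, 10),
    "num_outputs": (1, 4),
    "guidance_scale": (0, 10),
    "output_quality": (0, 100),
    "num_inference_steps": (1, 50),
}

def validate(inputs: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload looks valid."""
    problems = []
    if not inputs.get("prompt"):  # prompt is the only required field
        problems.append("prompt is required")
    for key, (lo, hi) in RANGES.items():
        if key in inputs and not (lo <= inputs[key] <= hi):
            problems.append(f"{key}={inputs[key]} outside [{lo}, {hi}]")
    return problems

payload = {
    "prompt": "a closeup portrait photo",
    "max_shift": 1.15,
    "base_shift": 0.5,
    "num_outputs": 1,
    "guidance_scale": 3,
    "output_quality": 95,
    "num_inference_steps": 28,
    "flux_layers_to_patch": "attn=1.1",
}
print(validate(payload))  # → []
```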
Input Parameters
seed Type: integer
Set a seed for reproducibility. Random by default.
prompt (required) Type: string
Prompt for the generated image. If you include the `trigger_word` used in the training process, you are more likely to activate the trained object, style, or concept in the resulting image.
sampler Default: euler
Sampler
max_shift Type: number Default: 1.15 Range: 0 - 10
Maximum shift
scheduler Default: simple
Scheduler
base_shift Type: number Default: 0.5 Range: 0 - 10
Base shift
num_outputs Type: integer Default: 1 Range: 1 - 4
Number of images to output.
aspect_ratio Default: 1:1
Aspect ratio for the generated image in text-to-image mode. The size will always be 1 megapixel, i.e. 1024x1024 if the aspect ratio is 1:1.
output_format Default: webp
Format of the output images
guidance_scale Type: number Default: 3 Range: 0 - 10
Guidance scale for the diffusion process. Lower values can give more realistic images. Good values to try are 2, 2.5, 3, and 3.5.
output_quality Type: integer Default: 95 Range: 0 - 100
Quality of the output images, from 0 to 100. 100 is best quality, 0 is lowest quality.
num_inference_steps Type: integer Default: 28 Range: 1 - 50
Number of inference steps. More steps can give more detailed images, but take longer.
flux_layers_to_patch Type: string Default: (empty)
Flux Dev layers to patch. A newline-separated list of layers with values, or a regular expression matching multiple layers, for example: 'double_blocks.0.img_mod.lin.weight=1.01' or 'attn=1.01'. See the readme for examples.
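The execution log further down shows how a single pattern fans out: `attn=1.1` matched every attention tensor in all 19 double blocks. A minimal sketch of how such a spec might be expanded is below; the parsing rules are assumptions inferred from the description and the log, not the model's actual code.

```python
import re

def expand_patches(spec: str, layer_names: list[str]) -> dict[str, float]:
    """Expand a newline-separated 'pattern=value' spec into per-layer values.

    Each pattern is searched as a regular expression against the full layer
    name, so 'attn=1.1' matches every layer whose name contains 'attn', while
    a full name like 'double_blocks.0.img_mod.lin.weight=1.01' matches just
    that tensor (the literal dots match themselves as the 'any char' regex).
    """
    patches = {}
    for line in spec.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blank or malformed lines
        pattern, value = line.rsplit("=", 1)
        regex = re.compile(pattern)
        for name in layer_names:
            if regex.search(name):
                patches[name] = float(value)
    return patches

# A few hypothetical layer names in the style of the log output:
layers = [
    "diffusion_model.double_blocks.0.img_attn.qkv.weight",
    "diffusion_model.double_blocks.0.img_mod.lin.weight",
    "diffusion_model.single_blocks.0.linear1.weight",
]
print(expand_patches("attn=1.1", layers))
```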
Output Schema

Output

Type: array Items Type: string Items Format: uri

Example Execution Logs
Random seed set to: 2560102052
Running workflow
got prompt
Executing node 99, title: 🔧 Flux Model Blocks Buster, class type: FluxBlocksBuster+
Activated layers: diffusion_model.double_blocks.0.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.0.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.0.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.0.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.0.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.0.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.0.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.0.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.0.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.0.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.0.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.0.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.1.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.1.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.1.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.1.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.1.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.1.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.1.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.1.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.1.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.1.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.1.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.1.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.2.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.2.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.2.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.2.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.2.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.2.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.2.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.2.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.2.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.2.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.2.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.2.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.3.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.3.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.3.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.3.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.3.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.3.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.3.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.3.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.3.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.3.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.3.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.3.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.4.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.4.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.4.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.4.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.4.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.4.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.4.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.4.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.4.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.4.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.4.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.4.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.5.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.5.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.5.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.5.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.5.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.5.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.5.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.5.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.5.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.5.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.5.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.5.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.6.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.6.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.6.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.6.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.6.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.6.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.6.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.6.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.6.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.6.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.6.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.6.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.7.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.7.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.7.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.7.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.7.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.7.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.7.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.7.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.7.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.7.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.7.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.7.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.8.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.8.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.8.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.8.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.8.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.8.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.8.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.8.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.8.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.8.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.8.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.8.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.9.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.9.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.9.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.9.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.9.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.9.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.9.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.9.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.9.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.9.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.9.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.9.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.10.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.10.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.10.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.10.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.10.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.10.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.10.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.10.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.10.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.10.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.10.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.10.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.11.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.11.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.11.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.11.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.11.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.11.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.11.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.11.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.11.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.11.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.11.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.11.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.12.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.12.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.12.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.12.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.12.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.12.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.12.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.12.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.12.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.12.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.12.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.12.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.13.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.13.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.13.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.13.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.13.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.13.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.13.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.13.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.13.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.13.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.13.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.13.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.14.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.14.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.14.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.14.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.14.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.14.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.14.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.14.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.14.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.14.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.14.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.14.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.15.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.15.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.15.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.15.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.15.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.15.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.15.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.15.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.15.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.15.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.15.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.15.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.16.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.16.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.16.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.16.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.16.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.16.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.16.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.16.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.16.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.16.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.16.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.16.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.17.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.17.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.17.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.17.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.17.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.17.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.17.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.17.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.17.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.17.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.17.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.17.txt_attn.proj.bias: 1.1
diffusion_model.double_blocks.18.img_attn.qkv.weight: 1.1
diffusion_model.double_blocks.18.img_attn.qkv.bias: 1.1
diffusion_model.double_blocks.18.img_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.18.img_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.18.img_attn.proj.weight: 1.1
diffusion_model.double_blocks.18.img_attn.proj.bias: 1.1
diffusion_model.double_blocks.18.txt_attn.qkv.weight: 1.1
diffusion_model.double_blocks.18.txt_attn.qkv.bias: 1.1
diffusion_model.double_blocks.18.txt_attn.norm.query_norm.scale: 1.1
diffusion_model.double_blocks.18.txt_attn.norm.key_norm.scale: 1.1
diffusion_model.double_blocks.18.txt_attn.proj.weight: 1.1
diffusion_model.double_blocks.18.txt_attn.proj.bias: 1.1
Executing node 104, title: 🔧 Console Debug, class type: ConsoleDebug+
Executing node 25, title: RandomNoise, class type: RandomNoise
Executing node 61, title: ModelSamplingFlux, class type: ModelSamplingFlux
Executing node 6, title: CLIP Text Encode (Prompt), class type: CLIPTextEncode
Executing node 60, title: FluxGuidance, class type: FluxGuidance
Executing node 22, title: BasicGuider, class type: BasicGuider
Executing node 17, title: BasicScheduler, class type: BasicScheduler
Executing node 5, title: Empty Latent Image, class type: EmptyLatentImage
Executing node 13, title: SamplerCustomAdvanced, class type: SamplerCustomAdvanced
Requested to load Flux
Loading 1 new model
loaded completely 0.0 22700.097778320312 True
  0%|          | 0/28 [00:00<?, ?it/s]
  4%|▎         | 1/28 [00:00<00:05,  4.66it/s]
  7%|▋         | 2/28 [00:00<00:08,  3.00it/s]
 11%|█         | 3/28 [00:01<00:09,  2.69it/s]
 14%|█▍        | 4/28 [00:01<00:09,  2.56it/s]
 18%|█▊        | 5/28 [00:01<00:09,  2.50it/s]
 21%|██▏       | 6/28 [00:02<00:08,  2.46it/s]
 25%|██▌       | 7/28 [00:02<00:08,  2.44it/s]
 29%|██▊       | 8/28 [00:03<00:08,  2.42it/s]
 32%|███▏      | 9/28 [00:03<00:07,  2.41it/s]
 36%|███▌      | 10/28 [00:03<00:07,  2.40it/s]
 39%|███▉      | 11/28 [00:04<00:07,  2.40it/s]
 43%|████▎     | 12/28 [00:04<00:06,  2.39it/s]
 46%|████▋     | 13/28 [00:05<00:06,  2.39it/s]
 50%|█████     | 14/28 [00:05<00:05,  2.39it/s]
 54%|█████▎    | 15/28 [00:06<00:05,  2.39it/s]
 57%|█████▋    | 16/28 [00:06<00:05,  2.39it/s]
 61%|██████    | 17/28 [00:06<00:04,  2.38it/s]
 64%|██████▍   | 18/28 [00:07<00:04,  2.38it/s]
 68%|██████▊   | 19/28 [00:07<00:03,  2.38it/s]
 71%|███████▏  | 20/28 [00:08<00:03,  2.38it/s]
 75%|███████▌  | 21/28 [00:08<00:02,  2.38it/s]
 79%|███████▊  | 22/28 [00:09<00:02,  2.38it/s]
 82%|████████▏ | 23/28 [00:09<00:02,  2.38it/s]
 86%|████████▌ | 24/28 [00:09<00:01,  2.38it/s]
 89%|████████▉ | 25/28 [00:10<00:01,  2.38it/s]
 93%|█████████▎| 26/28 [00:10<00:00,  2.38it/s]
 96%|█████████▋| 27/28 [00:11<00:00,  2.38it/s]
100%|██████████| 28/28 [00:11<00:00,  2.38it/s]
100%|██████████| 28/28 [00:11<00:00,  2.43it/s]
Executing node 8, title: VAE Decode, class type: VAEDecode
Executing node 9, title: Save Image, class type: SaveImage
Prompt executed in 27.56 seconds
outputs:  {'9': {'images': [{'filename': 'R8_flux_blocks_00001_.png', 'subfolder': '', 'type': 'output'}]}}
====================================
R8_flux_blocks_00001_.png
Version Details
Version ID
f242c110a76fd5f620d79089351c9f909b237d46c1219d46206b98a0244d82a2
Version Created
September 20, 2024