jensbosseparra/flux1-schnell-multi-lora 🔢🖼️📝❓✓ → 🖼️

▶️ 2.7K runs 📅 Nov 2024 ⚙️ Cog 0.12.0
image-to-image text-to-image

About

An adaptation of https://replicate.com/lucataco/flux-dev-multi-lora that adds multi-LoRA support for FLUX.1 [schnell] as well.

Example Output

Prompt:

"a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y"

Output

Example output

Performance Metrics

6.32s Prediction Time
6.32s Total Time
All Input Parameters
{
  "seed": 43,
  "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
  "hf_loras": [
    "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
    "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
  ],
  "lora_scales": [
    0.8,
    0.9
  ],
  "num_outputs": 1,
  "aspect_ratio": "1:1",
  "output_format": "webp",
  "guidance_scale": 0,
  "output_quality": 100,
  "prompt_strength": 0.8,
  "num_inference_steps": 4
}
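The request above can be assembled and sanity-checked locally before it is sent. The sketch below is illustrative (the `build_schnell_input` helper and its validation rules are not part of the model; the ranges mirror the parameter list on this page):

```python
def build_schnell_input(prompt, hf_loras, lora_scales, seed=None, num_outputs=1,
                        aspect_ratio="1:1", output_format="webp",
                        num_inference_steps=4):
    """Assemble an input payload for the model, with basic range checks."""
    if len(hf_loras) != len(lora_scales):
        raise ValueError("each entry in hf_loras needs a matching lora_scale")
    if not 1 <= num_inference_steps <= 4:
        raise ValueError("num_inference_steps must be in 1-4")
    if not 1 <= num_outputs <= 4:
        raise ValueError("num_outputs must be in 1-4")
    payload = {
        "prompt": prompt,
        "hf_loras": list(hf_loras),
        "lora_scales": list(lora_scales),
        "num_outputs": num_outputs,
        "aspect_ratio": aspect_ratio,
        "output_format": output_format,
        "num_inference_steps": num_inference_steps,
    }
    if seed is not None:  # omit for a random seed
        payload["seed"] = seed
    return payload

# With the replicate client installed and REPLICATE_API_TOKEN set, the payload
# would be sent roughly like this (not executed here):
#   import replicate
#   urls = replicate.run("jensbosseparra/flux1-schnell-multi-lora",
#                        input=build_schnell_input(...))
```

Catching a mismatched `hf_loras`/`lora_scales` pair client-side avoids spending a prediction on a request that cannot apply the weights as intended.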
Input Parameters
seed Type: integer
Random seed. Set for reproducible generation
image Type: string
Input image for image to image mode. The aspect ratio of your output will match this image
prompt (required) Type: string
Prompt for generated image
hf_loras Type: array
Hugging Face path or URL to the LoRA weights, e.g. alvdansen/frosting_lane_flux
lora_scales Type: array
Scale for the LoRA weights. Default value is 0.8
num_outputs Type: integer Default: 1 Range: 1 - 4
Number of images to output.
aspect_ratio Default: 512:512
Aspect ratio & resolution of the generated image. The only low-resolution option is 512:512.
output_format Default: webp
Format of the output images
guidance_scale Type: number Default: 0 Range: 0 - 0
Guidance scale for the diffusion process
output_quality Type: integer Default: 100 Range: 0 - 100
Quality when saving the output images, from 0 (lowest) to 100 (best). Not relevant for .png outputs
prompt_strength Type: number Default: 0.8 Range: 0 - 1
Prompt strength (denoising strength) when using image-to-image. 1.0 corresponds to full destruction of the information in the input image.
num_inference_steps Type: integer Default: 4 Range: 1 - 4
Number of inference steps
disable_safety_checker Type: boolean Default: false
Disable the safety checker for generated images. This feature is only available through the API. See https://replicate.com/docs/how-does-replicate-work#safety
Output Schema

Output

Type: array
Items Type: string
Items Format: uri
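Since the output is an array of image URIs, saving the results usually means deriving a local filename per URL. A minimal stdlib sketch (the `output_filenames` helper is hypothetical; pair it with any HTTP client to do the actual download):

```python
from urllib.parse import urlparse
from pathlib import PurePosixPath

def output_filenames(urls, stem="output"):
    """Derive local filenames like output_0.webp from the model's output URIs.

    Keeps each URL's extension; falls back to .webp (the model's default
    output_format) when the URL path has none.
    """
    names = []
    for i, url in enumerate(urls):
        ext = PurePosixPath(urlparse(url).path).suffix or ".webp"
        names.append(f"{stem}_{i}{ext}")
    return names
```

Indexing by position keeps the filenames aligned with the order of the returned array, which matters when `num_outputs` is greater than 1.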

Example Execution Logs
Using seed: 43
Prompt: a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y
txt2img mode
Downloading LoRA weights from - HF URL: https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors
HuggingFace slug from URL: Octree/flux-schnell-lora, weight name: flux-schnell-lora.safetensors
Loading LoRA took: 0.68 seconds
Downloading LoRA weights from - HF URL: https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors
HuggingFace slug from URL: hugovntr/flux-schnell-realism, weight name: schnell-realism_v1.safetensors
Unsuppored keys for ai-toolkit: dict_keys([...text-encoder (lora_te1) LoRA keys for encoder layers 0-11: mlp_fc1, mlp_fc2, and the self-attention q/k/v/out projections, each with .alpha, .lora_down.weight, and .lora_up.weight entries...])
Loading LoRA took: 2.27 seconds
  0%|          | 0/4 [00:00<?, ?it/s]
 25%|██▌       | 1/4 [00:00<00:01,  1.85it/s]
 50%|█████     | 2/4 [00:00<00:00,  2.45it/s]
 75%|███████▌  | 3/4 [00:01<00:00,  2.26it/s]
100%|██████████| 4/4 [00:01<00:00,  2.18it/s]
100%|██████████| 4/4 [00:01<00:00,  2.19it/s]
Version Details
Version ID
85e4655b8b7d00ee7b66e2ff6986a489f710ff799bb4a62a7d7b34e66a3bdcd2
Version Created
January 28, 2025
Run on Replicate →