jensbosseparra/flux1-schnell-multi-lora
About
An adaptation of https://replicate.com/lucataco/flux-dev-multi-lora that brings multi-LoRA support to FLUX.1 [schnell].
Example Output
Prompt:
"a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y"
Output
Performance Metrics
- Prediction Time: 6.32s
- Total Time: 6.32s
All Input Parameters
{
  "seed": 43,
  "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
  "hf_loras": [
    "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
    "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
  ],
  "lora_scales": [
    0.8,
    0.9
  ],
  "num_outputs": 1,
  "aspect_ratio": "1:1",
  "output_format": "webp",
  "guidance_scale": 0,
  "output_quality": 100,
  "prompt_strength": 0.8,
  "num_inference_steps": 4
}
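The same prediction can be reproduced from code. Below is a minimal sketch using the Replicate Python client (`pip install replicate`, with `REPLICATE_API_TOKEN` set); the version ID is the one listed under Version Details, and the save step assumes the client returns file-like outputs, as recent versions do:

```python
import replicate

output = replicate.run(
    # Pinning the version ID from "Version Details" keeps the prediction
    # reproducible if the model is updated later.
    "jensbosseparra/flux1-schnell-multi-lora:85e4655b8b7d00ee7b66e2ff6986a489f710ff799bb4a62a7d7b34e66a3bdcd2",
    input={
        "seed": 43,
        "prompt": "a beautiful scandinavian girl posing in the mountains, "
                  "the sun is shining, the grass is green and flowers are "
                  "everywhere, she is wearing a patterned colorful top. be4u7y",
        "hf_loras": [
            "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
            "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors",
        ],
        "lora_scales": [0.8, 0.9],
        "num_outputs": 1,
        "aspect_ratio": "1:1",
        "output_format": "webp",
        "guidance_scale": 0,
        "num_inference_steps": 4,
    },
)

# One image per num_outputs; write the first one to disk.
with open("output.webp", "wb") as f:
    f.write(output[0].read())
```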
Input Parameters
- seed
- Random seed. Set for reproducible generation
- image
- Input image for image-to-image mode. The aspect ratio of your output will match this image
- prompt (required)
- Prompt for generated image
- hf_loras
- Hugging Face repo path, or URL to the LoRA weights. Ex: alvdansen/frosting_lane_flux. Accepts multiple entries to stack several LoRAs; see the sketch after this list
- lora_scales
- Scale for each LoRA in hf_loras, in the same order. Default value is 0.8
- num_outputs
- Number of images to output.
- aspect_ratio
- Aspect ratio and resolution for the generated image. The only low-resolution option is 512:512
- output_format
- Format of the output images
- guidance_scale
- Guidance scale for the diffusion process
- output_quality
- Quality when saving the output images, from 0 to 100. 100 is best quality, 0 is lowest quality. Not relevant for .png outputs
- prompt_strength
- Prompt strength (or denoising strength) when using image-to-image mode. 1.0 corresponds to fully discarding the information in the input image
- num_inference_steps
- Number of inference steps
- disable_safety_checker
- Disable safety checker for generated images. This feature is only available through the API. See [https://replicate.com/docs/how-does-replicate-work#safety](https://replicate.com/docs/how-does-replicate-work#safety)
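The hf_loras/lora_scales pairing maps naturally onto diffusers' multi-adapter LoRA API. The sketch below reconstructs the idea in plain diffusers, using the two LoRAs from the example prediction; the adapter names are illustrative, and this is not necessarily this model's exact implementation:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
).to("cuda")

# Load each LoRA under its own adapter name (repo slug + weight file,
# mirroring the hf_loras URLs above).
pipe.load_lora_weights(
    "Octree/flux-schnell-lora",
    weight_name="flux-schnell-lora.safetensors",
    adapter_name="schnell_lora",
)
pipe.load_lora_weights(
    "hugovntr/flux-schnell-realism",
    weight_name="schnell-realism_v1.safetensors",
    adapter_name="realism",
)

# Activate both adapters with the per-LoRA scales from lora_scales.
pipe.set_adapters(["schnell_lora", "realism"], adapter_weights=[0.8, 0.9])

image = pipe(
    "a beautiful scandinavian girl posing in the mountains, the sun is "
    "shining, the grass is green and flowers are everywhere, she is "
    "wearing a patterned colorful top. be4u7y",
    guidance_scale=0.0,        # schnell is guidance-distilled
    num_inference_steps=4,     # the 4-step schnell default
    generator=torch.Generator("cuda").manual_seed(43),
).images[0]
image.save("output.webp")
```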
Output Schema
Example Execution Logs
Using seed: 43
Prompt: a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y
txt2img mode
Downloading LoRA weights from - HF URL: https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors
HuggingFace slug from URL: Octree/flux-schnell-lora, weight name: flux-schnell-lora.safetensors
Loading LoRA took: 0.68 seconds
Downloading LoRA weights from - HF URL: https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors
HuggingFace slug from URL: hugovntr/flux-schnell-realism, weight name: schnell-realism_v1.safetensors
Unsuppored keys for ai-toolkit: dict_keys([216 CLIP text-encoder keys elided: lora_te1_text_model_encoder_layers_{0-11}_{mlp_fc1, mlp_fc2, self_attn_{k,out,q,v}_proj}.{alpha, lora_down.weight, lora_up.weight}])
Loading LoRA took: 2.27 seconds
0%|          | 0/4 [00:00<?, ?it/s]
25%|██▌       | 1/4 [00:00<00:01, 1.85it/s]
50%|█████     | 2/4 [00:00<00:00, 2.45it/s]
75%|███████▌  | 3/4 [00:01<00:00, 2.26it/s]
100%|██████████| 4/4 [00:01<00:00, 2.18it/s]
100%|██████████| 4/4 [00:01<00:00, 2.19it/s]
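The "HuggingFace slug from URL" lines above suggest that each full hf_loras URL is split into a repo slug and a weight filename before loading. A hypothetical reconstruction of that step (the function name and URL handling are assumptions, not this model's actual code):

```python
from urllib.parse import urlparse

def split_hf_lora_url(url: str) -> tuple[str, str]:
    """Split an hf.co '.../resolve/<rev>/<file>' URL into (slug, weight name).

    Hypothetical sketch of the parsing the logs hint at; the real model
    may also accept plain 'owner/name' slugs and other URL shapes.
    """
    parts = urlparse(url).path.strip("/").split("/")
    # e.g. ['Octree', 'flux-schnell-lora', 'resolve', 'main',
    #       'flux-schnell-lora.safetensors']
    slug = "/".join(parts[:2])
    weight_name = parts[-1]
    return slug, weight_name

slug, weight = split_hf_lora_url(
    "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors"
)
print(slug, weight)  # Octree/flux-schnell-lora flux-schnell-lora.safetensors
```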
Version Details
- Version ID
- 85e4655b8b7d00ee7b66e2ff6986a489f710ff799bb4a62a7d7b34e66a3bdcd2
- Version Created
- January 28, 2025