tuannha/instant-character (image → image)

3.0K runs · Apr 2025 · Cog 0.14.3 · GitHub · License

Tags: image-consistent-character-generation, image-to-image

About

Tencent's InstantCharacter, an image-to-image model for consistent character generation: given a single subject image and a text prompt, it renders the same character in new scenes and styles.

Example Output

Prompt:

"a character in the library"

Output

Example output

Performance Metrics

19.85s Prediction Time
19.85s Total Time
All Input Parameters
{
  "lora": "ghibli_style",
  "seed": -1,
  "width": 768,
  "height": 1344,
  "prompt": "a character in the library",
  "subject_image": "https://replicate.delivery/pbxt/Mr9015ajTj5WrUEuW6YyWbxxRifCNCo0SbiA0L0F4vqjQmXS/face.png",
  "guidance_scale": 3.5,
  "num_inference_steps": 28
}
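The input JSON above maps directly onto a call through the official Replicate Python client (`replicate.run`). The sketch below is a minimal example, assuming the `replicate` package is installed and `REPLICATE_API_TOKEN` is set in the environment; the `build_input` helper is our own illustration, not part of the client API.

```python
def build_input(prompt, subject_image, **overrides):
    """Assemble an input dict using the defaults listed on this page.

    Illustrative helper, not part of the replicate client.
    """
    params = {
        "prompt": prompt,                # required
        "subject_image": subject_image,  # required: URL or open file handle
        "lora": "none",
        "seed": -1,                      # -1 picks a random seed
        "width": 512,
        "height": 512,
        "guidance_scale": 3.5,
        "num_inference_steps": 28,
    }
    params.update(overrides)
    return params


def generate(prompt, subject_image, **overrides):
    """Run the model on Replicate (network call; needs REPLICATE_API_TOKEN)."""
    import replicate  # official client: pip install replicate

    return replicate.run(
        "tuannha/instant-character:"
        "df5eed34fa9c812acf62d3ca79874daf9b5e78c2bee11f4ada182a55dd5c1712",
        input=build_input(prompt, subject_image, **overrides),
    )
```

Reproducing the example shown here would be `generate("a character in the library", "<subject image URL>", lora="ghibli_style", width=768, height=1344)`.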
Input Parameters
lora · Default: none
LoRA to use
seed · Type: integer · Default: -1
Seed for the random number generator; -1 for a random seed
width · Type: integer · Default: 512 · Range: 512 - 2048
Width of the output image
height · Type: integer · Default: 512 · Range: 512 - 2048
Height of the output image
prompt (required) · Type: string
Prompt for the image generation
subject_image (required) · Type: string
Grayscale input image
guidance_scale · Type: number · Default: 3.5
Guidance scale for the diffusion process
num_inference_steps · Type: integer · Default: 28
Number of diffusion steps
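The schema describes `subject_image` as a grayscale input. If your reference photo is in colour, a one-line Pillow conversion produces a single-channel file (an assumption on our part that pre-converting is appropriate; the example input is simply a face image URL):

```python
from PIL import Image  # Pillow: pip install Pillow


def to_grayscale(src_path, dst_path):
    """Convert a reference image to single-channel ("L" mode) grayscale."""
    Image.open(src_path).convert("L").save(dst_path)
```

The converted file can then be uploaded or passed as an open file handle in `subject_image`.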
Output Schema

Output

Type: string · Format: uri
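Since the output is a single URI string, saving the generated image locally is a short stdlib-only step. The sketch below (our illustration) names the local file after the URL's path component:

```python
from pathlib import Path
from urllib.parse import urlparse
from urllib.request import urlretrieve


def filename_from_uri(uri):
    """Derive a local filename from the URI's path; fall back to a default."""
    name = Path(urlparse(uri).path).name
    return name or "output.png"


def download(uri, dest_dir="."):
    """Fetch the generated image (network call) into dest_dir."""
    dest = Path(dest_dir) / filename_from_uri(uri)
    urlretrieve(uri, dest)  # stdlib; adequate for a one-off fetch
    return dest
```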

Example Execution Logs
loading lora in transformer ...:   0%|          | 0/988 [00:00<?, ?it/s]
loading lora in transformer ...:  53%|█████▎    | 528/988 [00:00<00:00, 5258.64it/s]
loading lora in transformer ...: 100%|██████████| 988/988 [00:00<00:00, 3879.36it/s]
loading lora in text_encoder ...: 0it [00:00, ?it/s]
loading lora in text_encoder ...: 0it [00:00, ?it/s]
  0%|          | 0/28 [00:00<?, ?it/s]
  4%|▎         | 1/28 [00:00<00:17,  1.58it/s]
  7%|▋         | 2/28 [00:01<00:13,  1.87it/s]
 11%|█         | 3/28 [00:01<00:14,  1.73it/s]
 14%|█▍        | 4/28 [00:02<00:14,  1.67it/s]
 18%|█▊        | 5/28 [00:02<00:14,  1.63it/s]
 21%|██▏       | 6/28 [00:03<00:13,  1.61it/s]
 25%|██▌       | 7/28 [00:04<00:13,  1.60it/s]
 29%|██▊       | 8/28 [00:04<00:12,  1.59it/s]
 32%|███▏      | 9/28 [00:05<00:11,  1.59it/s]
 36%|███▌      | 10/28 [00:06<00:11,  1.58it/s]
 39%|███▉      | 11/28 [00:06<00:10,  1.58it/s]
 43%|████▎     | 12/28 [00:07<00:10,  1.58it/s]
 46%|████▋     | 13/28 [00:08<00:09,  1.58it/s]
 50%|█████     | 14/28 [00:08<00:08,  1.58it/s]
 54%|█████▎    | 15/28 [00:09<00:08,  1.57it/s]
 57%|█████▋    | 16/28 [00:09<00:07,  1.57it/s]
 61%|██████    | 17/28 [00:10<00:06,  1.57it/s]
 64%|██████▍   | 18/28 [00:11<00:06,  1.57it/s]
 68%|██████▊   | 19/28 [00:11<00:05,  1.57it/s]
 71%|███████▏  | 20/28 [00:12<00:05,  1.57it/s]
 75%|███████▌  | 21/28 [00:13<00:04,  1.57it/s]
 79%|███████▊  | 22/28 [00:13<00:03,  1.57it/s]
 82%|████████▏ | 23/28 [00:14<00:03,  1.57it/s]
 86%|████████▌ | 24/28 [00:15<00:02,  1.56it/s]
 89%|████████▉ | 25/28 [00:15<00:01,  1.56it/s]
 93%|█████████▎| 26/28 [00:16<00:01,  1.56it/s]
 96%|█████████▋| 27/28 [00:17<00:00,  1.56it/s]
100%|██████████| 28/28 [00:17<00:00,  1.56it/s]
100%|██████████| 28/28 [00:17<00:00,  1.59it/s]
loading lora in transformer ...:   0%|          | 0/988 [00:00<?, ?it/s]
loading lora in transformer ...:  65%|██████▍   | 640/988 [00:00<00:00, 6389.53it/s]
loading lora in transformer ...: 100%|██████████| 988/988 [00:00<00:00, 4527.79it/s]
loading lora in text_encoder ...: 0it [00:00, ?it/s]
loading lora in text_encoder ...: 0it [00:00, ?it/s]
Version Details
Version ID
df5eed34fa9c812acf62d3ca79874daf9b5e78c2bee11f4ada182a55dd5c1712
Version Created
April 18, 2025