bfirsh/vqgan-clip
About
Generates images with VQGAN and CLIP
Example Output
Prompt:
"the first day of the waters"
Output
(generated image omitted)
Performance Metrics
289.50s
Total Time
All Input Parameters
{
"prompt": "the first day of the waters",
"iterations": 250
}
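The JSON above is the request body for this run. A minimal sketch of assembling the same payload with the standard library (the helper and its default value are assumptions for illustration, not part of this page):

```python
import json

def build_input(prompt, iterations=250, **extra):
    """Assemble an input payload for a run. Unspecified parameters
    (seed, step_size, cutn, ...) fall back to server-side defaults."""
    payload = {"prompt": prompt, "iterations": iterations}
    payload.update(extra)  # e.g. seed=42, step_size=0.05
    return json.dumps(payload)

body = build_input("the first day of the waters")
```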
Input Parameters
- cutn: Number of image cutouts scored by CLIP each iteration
- seed: Random seed
- prompt: Text prompt
- cut_pow: Exponent that biases the distribution of cutout sizes
- step_size: Optimizer step size (learning rate)
- iterations: Number of iterations
- image_prompt: Image prompt
- initial_image: Image to start with
- initial_weight: Weight of the loss keeping the output close to the initial image
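`cutn` and `cut_pow` control the random cutouts of the working image that CLIP scores each iteration: higher `cut_pow` biases draws toward smaller crops. A rough stdlib sketch of the size rule used by the original VQGAN+CLIP notebook (the min/max bounds here are illustrative assumptions):

```python
import random

def cutout_sizes(cutn, cut_pow, min_size, max_size, rng=None):
    """Draw `cutn` cutout side lengths. rand()**cut_pow skews the
    distribution toward min_size as cut_pow grows."""
    rng = rng or random.Random(0)
    return [int(rng.random() ** cut_pow * (max_size - min_size) + min_size)
            for _ in range(cutn)]

sizes = cutout_sizes(cutn=8, cut_pow=1.0, min_size=32, max_size=224)
```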
Output Schema
Example Execution Logs
i: 0, loss: 0.909588, losses: 0.909588
i: 10, loss: 0.847548, losses: 0.847548
i: 20, loss: 0.818597, losses: 0.818597
i: 30, loss: 0.808365, losses: 0.808365
i: 40, loss: 0.801111, losses: 0.801111
i: 50, loss: 0.791895, losses: 0.791895
i: 60, loss: 0.786057, losses: 0.786057
i: 70, loss: 0.774552, losses: 0.774552
i: 80, loss: 0.770548, losses: 0.770548
i: 90, loss: 0.763699, losses: 0.763699
i: 100, loss: 0.759811, losses: 0.759811
i: 110, loss: 0.752506, losses: 0.752506
i: 120, loss: 0.749852, losses: 0.749852
i: 130, loss: 0.749731, losses: 0.749731
i: 140, loss: 0.749962, losses: 0.749962
i: 150, loss: 0.747008, losses: 0.747008
i: 160, loss: 0.742415, losses: 0.742415
i: 170, loss: 0.744414, losses: 0.744414
i: 180, loss: 0.738426, losses: 0.738426
i: 190, loss: 0.73867, losses: 0.73867
i: 200, loss: 0.737181, losses: 0.737181
i: 210, loss: 0.736317, losses: 0.736317
i: 220, loss: 0.734543, losses: 0.734543
i: 230, loss: 0.729937, losses: 0.729937
i: 240, loss: 0.729107, losses: 0.729107
i: 249, loss: 0.733397, losses: 0.733397
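Logs in this format are easy to scrape for a convergence check. A small sketch that extracts the `i: N, loss: X` pairs (the helper name is illustrative):

```python
import re

def parse_losses(log_text):
    """Extract (iteration, loss) pairs from lines like 'i: 0, loss: 0.909588'."""
    return [(int(i), float(loss))
            for i, loss in re.findall(r"i: (\d+), loss: ([\d.]+)", log_text)]

log = "i: 0, loss: 0.909588, losses: 0.909588 i: 10, loss: 0.847548, losses: 0.847548"
pairs = parse_losses(log)
```

Note the loss is not strictly monotonic (it ticks up at i=249 above), so a trend check should compare averages rather than consecutive values.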
Version Details
- Version ID: 2c07b8ebd0d0803e671966e634363b8f7ab4849e6504aac48b9c60320b1eba1a
- Version Created: September 24, 2021