dribnet/pixray-pixel ❓📝 → 🖼️
About

Pixray text-to-image generation with the pixel drawer, which renders the text prompt as a coarse pixel-art grid.
Example Output
Output
[30 output entries; image previews not captured in this text export]
Performance Metrics
Total Time: 475.07s
All Input Parameters
{ "aspect": "widescreen", "drawer": "pixel", "prompts": "computer love. #pixelart" }
Input Parameters
- aspect: wide vs. square (see the grid-size sketch after this list)
- drawer: render engine
- prompts: text prompts
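The execution log further down reports "Running pixeldrawer with 80x45 grid" for the widescreen aspect. A hypothetical helper like the one below illustrates how the aspect choice maps to the pixel grid; the square dimensions are an illustrative assumption, not taken from this page.

def grid_for_aspect(aspect: str) -> tuple[int, int]:
    # "widescreen" -> 80x45, matching the log line "Running pixeldrawer with 80x45 grid".
    if aspect == "widescreen":
        return (80, 45)
    # Assumed square grid size, for illustration only.
    if aspect == "square":
        return (48, 48)
    raise ValueError(f"unsupported aspect: {aspect!r}")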
Output Schema
Output
Example Execution Logs
---> BasePixrayPredictor Predict
Using seed: 10291670960213269173
Running pixeldrawer with 80x45 grid
Using device: cuda:0
Optimising using: Adam
Using text prompts: ['computer love. #pixelart']
/root/.pyenv/versions/3.8.12/lib/python3.8/site-packages/torch/nn/functional.py:3609: UserWarning: Default upsampling behavior when mode=bilinear is changed to align_corners=False since 0.4.0. Please specify align_corners=True if the old behavior is desired. See the documentation of nn.Upsample for details.
  warnings.warn(
iter: 0, loss: 2.89, losses: 0.958, 0.0798, 0.878, 0.0475, 0.88, 0.0492 (-0=>2.892)
iter: 10, loss: 2.7, losses: 0.909, 0.077, 0.81, 0.0498, 0.811, 0.0473 (-0=>2.704)
iter: 20, loss: 2.65, losses: 0.9, 0.0796, 0.779, 0.048, 0.793, 0.0467 (-5=>2.615)
iter: 30, loss: 2.54, losses: 0.878, 0.0804, 0.734, 0.0496, 0.747, 0.0461 (-4=>2.527)
iter: 40, loss: 2.36, losses: 0.819, 0.0808, 0.68, 0.0522, 0.68, 0.0491 (-0=>2.361)
iter: 50, loss: 2.39, losses: 0.833, 0.0826, 0.69, 0.0507, 0.688, 0.048 (-3=>2.3)
iter: 60, loss: 2.27, losses: 0.781, 0.0831, 0.659, 0.0529, 0.642, 0.0495 (-1=>2.261)
iter: 70, loss: 2.4, losses: 0.83, 0.0838, 0.695, 0.0513, 0.689, 0.048 (-3=>2.238)
iter: 80, loss: 2.22, losses: 0.762, 0.0834, 0.646, 0.0533, 0.628, 0.0495 (-0=>2.222)
iter: 90, loss: 2.32, losses: 0.81, 0.0846, 0.671, 0.052, 0.659, 0.049 (-1=>2.192)
iter: 100, loss: 2.33, losses: 0.814, 0.0833, 0.67, 0.052, 0.662, 0.0484 (-3=>2.189)
iter: 110, loss: 2.29, losses: 0.791, 0.0844, 0.664, 0.0527, 0.652, 0.0496 (-3=>2.167)
iter: 120, loss: 2.18, losses: 0.747, 0.0832, 0.633, 0.0539, 0.611, 0.0508 (-13=>2.167)
iter: 130, loss: 2.3, losses: 0.796, 0.0839, 0.671, 0.053, 0.651, 0.0494 (-23=>2.167)
iter: 140, loss: 2.29, losses: 0.797, 0.084, 0.659, 0.0529, 0.644, 0.0493 (-33=>2.167)
iter: 150, loss: 2.32, losses: 0.802, 0.0833, 0.672, 0.0534, 0.659, 0.0493 (-3=>2.159)
iter: 160, loss: 2.31, losses: 0.796, 0.083, 0.667, 0.0527, 0.658, 0.0489 (-13=>2.159)
iter: 170, loss: 2.25, losses: 0.777, 0.0849, 0.649, 0.0543, 0.635, 0.05 (-23=>2.159)
iter: 180, loss: 2.17, losses: 0.751, 0.0836, 0.628, 0.0547, 0.605, 0.051 (-33=>2.159)
iter: 190, loss: 2.18, losses: 0.751, 0.0833, 0.636, 0.0535, 0.61, 0.0507 (-9=>2.152)
iter: 200, loss: 2.27, losses: 0.785, 0.083, 0.657, 0.053, 0.646, 0.0492 (-19=>2.152)
iter: 210, loss: 2.29, losses: 0.79, 0.0839, 0.663, 0.0534, 0.653, 0.0493 (-1=>2.137)
iter: 220, loss: 2.25, losses: 0.775, 0.0847, 0.648, 0.0546, 0.637, 0.0497 (-5=>2.135)
Dropping learning rate
iter: 230, loss: 2.23, losses: 0.767, 0.0848, 0.647, 0.0546, 0.627, 0.0505 (-4=>2.167)
iter: 240, loss: 2.3, losses: 0.798, 0.0824, 0.662, 0.053, 0.653, 0.0489 (-1=>2.133)
iter: 250, loss: 2.26, losses: 0.78, 0.0838, 0.651, 0.0544, 0.639, 0.0495 (-1=>2.131)
iter: 260, loss: 2.22, losses: 0.764, 0.0849, 0.64, 0.055, 0.622, 0.0505 (-9=>2.122)
iter: 270, loss: 2.28, losses: 0.791, 0.083, 0.659, 0.0542, 0.646, 0.0495 (-3=>2.118)
iter: 280, loss: 2.25, losses: 0.773, 0.0848, 0.654, 0.0543, 0.638, 0.0506 (-13=>2.118)
iter: 290, loss: 2.25, losses: 0.779, 0.085, 0.646, 0.0544, 0.632, 0.0496 (-23=>2.118)
iter: 300, finished (-5=>2.117)
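For context on what these lines mean: pixray optimises a coarse pixel grid with Adam so that CLIP scores it highly against the text prompt, printing the combined loss every 10 iterations and dropping the learning rate partway through. The six "losses" components appear to come from pixray summing several loss terms, and the trailing "(-N=>x)" appears to track the running best loss. The sketch below is not pixray's code, just a minimal CLIP-guided loop of the same shape, assuming torch and OpenAI's clip package are installed.

import torch
import torch.nn.functional as F
import clip  # OpenAI CLIP: https://github.com/openai/CLIP

device = "cuda:0" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/32", device=device)

# Encode the text prompt once.
tokens = clip.tokenize(["computer love. #pixelart"]).to(device)
with torch.no_grad():
    text_feat = F.normalize(model.encode_text(tokens).float(), dim=-1)

# 80x45 pixel grid, as reported in the log, optimised directly with Adam.
pixels = torch.rand(1, 3, 45, 80, device=device, requires_grad=True)
opt = torch.optim.Adam([pixels], lr=0.1)

for i in range(301):
    opt.zero_grad()
    # Upsample the coarse grid to CLIP's 224x224 input resolution; the
    # UserWarning in the log comes from a similar bilinear upsample in pixray.
    img = F.interpolate(pixels.clamp(0, 1), size=(224, 224),
                        mode="bilinear", align_corners=False)
    img_feat = F.normalize(model.encode_image(img).float(), dim=-1)
    # Single cosine-distance loss; pixray sums several such terms.
    loss = (1.0 - (img_feat * text_feat).sum(dim=-1)).mean()
    loss.backward()
    opt.step()
    if i % 10 == 0:
        print(f"iter: {i}, loss: {loss.item():.3g}")
    if i == 220:
        # The real run logs "Dropping learning rate" around this point.
        for group in opt.param_groups:
            group["lr"] *= 0.5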
Version Details
- Version ID
1374803de2cb9b28089256384e118edead1dd94a788714def7d482e7f1225395
- Version Created
October 27, 2022
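To reproduce this exact run rather than whatever version is currently latest, the version ID above can be pinned in the model reference. A minimal sketch, assuming the Replicate client's owner/name:version convention:

import replicate

# Pin the specific version listed above.
ref = ("dribnet/pixray-pixel:"
       "1374803de2cb9b28089256384e118edead1dd94a788714def7d482e7f1225395")
output = replicate.run(ref, input={
    "aspect": "widescreen",
    "drawer": "pixel",
    "prompts": "computer love. #pixelart",
})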