
Found 3 models

fofr/nsfw-model-comparison
Detect NSFW content in images and compare results across two classifiers. Accepts an image and returns JSON with falcon_...
Moderate images for safety and policy compliance. Accepts an image (optional custom prompt) and returns structured JSON...
Classify images for safety violations across sexually_explicit, dangerous_content, and violence_gore policies. Takes an...