
Found 10 models (showing 1-10)

- m1guelpf/nsfw-filter: Detect NSFW content in images. Analyze an input image with Stable Diffusion’s content filter and return a structured mod...
- Detect NSFW content in images. Accepts an image and returns a binary label of "nsfw" or "normal" for content moderation...
- Detect NSFW content in images for moderation. Accepts an image and returns category flags and confidence scores for nudi...
- Moderate images and prompts for NSFW, public figures, and copyright risk. Accepts an image and/or text prompt and return...
- Detect NSFW content in images and compare results across two classifiers. Accepts an image and returns JSON with falcon_...
- Moderate images for safety and policy compliance. Accepts an image (optional custom prompt) and returns structured JSON...
- Classify images for safety violations across sexually_explicit, dangerous_content, and violence_gore policies. Takes an...
- Moderate images and accompanying user messages by classifying safety risks. Takes an image and optional text input; outp...
- Detect NSFW content in images for content moderation. Takes a single image as input and returns a binary label (safe or...
- Classify text prompts, model responses, and multiple images for safety policy compliance. Accepts text and a list of ima...
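
The entries above describe each model's input/output contract but not how to invoke one. The following is a minimal sketch of calling the first result, assuming the models in this listing are Replicate-hosted, that the `replicate` Python client is installed with `REPLICATE_API_TOKEN` set, and that the model exposes a single `image` input; the parameter name, the example file path, and whether a version pin is required are assumptions, not confirmed by the listing.

```python
# Minimal sketch of calling an NSFW-detection model such as m1guelpf/nsfw-filter.
# Assumptions: Replicate hosting, `pip install replicate`, REPLICATE_API_TOKEN set,
# and an `image` input parameter on the model.
import replicate

with open("photo.jpg", "rb") as image_file:  # hypothetical local image
    result = replicate.run(
        "m1guelpf/nsfw-filter",      # model reference from the listing; a version
                                     # pin ("owner/name:versionhash") may be needed
        input={"image": image_file},  # assumed input parameter name
    )

# The listing suggests outputs range from a binary label ("nsfw"/"normal") to
# structured JSON with per-category confidence scores, so inspect the raw
# result before wiring it into a moderation pipeline.
print(result)
```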