fofr/nsfw-model-comparison 🖼️ → ❓

▶️ 164 runs 📅 Sep 2024 ⚙️ Cog 0.9.14 🔗 GitHub
image-moderation image-nsfw-detection

About

Compare NSFW detection models (the Falcon and CompVis safety checkers) on the same input image, reporting each model's safe/unsafe verdict and how long its check took.
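A minimal sketch of invoking this model with the Replicate Python client (`pip install replicate`). The image URL and the `build_input` helper are illustrative, not part of the model page; a `REPLICATE_API_TOKEN` must be set for the call to actually run.

```python
# Sketch: calling this model via the Replicate Python client.
# The image URL below is a placeholder.
import json
import os

MODEL = (
    "fofr/nsfw-model-comparison:"
    "684e5e879da2bcdbddc1a094650e17179d14024398d1d6146c6382ff07488fc1"
)

def build_input(image_url: str) -> dict:
    """Build the input payload; `image` is the model's only parameter."""
    return {"image": image_url}

# Only call the API when a token is configured in the environment.
if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate

    output = replicate.run(MODEL, input=build_input("https://example.com/photo.jpg"))
    # The model returns a JSON string (see Example Output below).
    print(json.loads(output) if isinstance(output, str) else output)
```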

Example Output

Output

{
  "falcon_is_safe": true,
  "compvis_is_safe": true,
  "falcon_time_taken": 0.14787530899047852,
  "compvis_time_taken": 0.4556546211242676
}
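The output is a JSON string with a boolean verdict and a per-checker timing. A small sketch of consuming it, using the example output above in its compact form:

```python
import json

# The example output above, as the compact JSON string the model emits.
raw = ('{"falcon_is_safe":true,"compvis_is_safe":true,'
       '"falcon_time_taken":0.14787530899047852,'
       '"compvis_time_taken":0.4556546211242676}')

result = json.loads(raw)

# Do the two safety checkers agree on this image?
agree = result["falcon_is_safe"] == result["compvis_is_safe"]

# How much faster was the Falcon check than the CompVis check?
speedup = result["compvis_time_taken"] / result["falcon_time_taken"]

print(f"models agree: {agree}, Falcon speedup: {speedup:.1f}x")
# → models agree: True, Falcon speedup: 3.1x
```

On this example both checkers return safe, and Falcon is roughly three times faster, matching the execution logs further down.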

Performance Metrics

1.15s Prediction Time
243.84s Total Time
Input Parameters
image (string, required): Input image
Example Execution Logs
Falcon safety check took 0.15 seconds
CompVis safety check took 0.46 seconds
Falcon output:  True
Compvis output:  True
Version Details
Version ID
684e5e879da2bcdbddc1a094650e17179d14024398d1d6146c6382ff07488fc1
Version Created
September 4, 2024