Enhance Content Moderation with NSFW Image Filtering

27 Apr 2025

In today's digital landscape, ensuring a safe and respectful online environment is crucial, especially for user-generated content. The Nsfw Filter provides developers with Cognitive Actions designed to filter Not Safe For Work (NSFW) content out of images. The filter analyzes each image to detect and flag inappropriate content, making content moderation faster and more reliable.

Common use cases for the Nsfw Filter include social media platforms, online forums, and any application where user-uploaded images are prevalent. By integrating this filter, developers can protect their users from exposure to explicit content, ensuring a more secure and family-friendly experience.

Prerequisites

To utilize the Nsfw Filter, you'll need a Cognitive Actions API key and a basic understanding of API calls to seamlessly integrate this functionality into your applications.

Filter NSFW Content in Images

The "Filter NSFW Content in Images" action allows developers to analyze images for inappropriate content using the Stable Diffusion content filter. This action is essential for maintaining community standards and ensuring that users interact in a safe environment.

Input Requirements

The action requires a single input: an image URL pointing to the image that will be processed. The URL must be in a valid URI format, such as:

{
  "image": "https://replicate.delivery/pbxt/HopNzJn8F7MbMq47EpzTOubJgxT4N4DWZwb5dWURi885Zm6C/kid.jpeg"
}

Expected Output

The output indicates whether NSFW content was detected. It includes an array of detected NSFW categories ("nsfw"), any special tags associated with the image ("special"), and a boolean flag ("nsfw_detected"). For example:

{
  "nsfw": [],
  "special": [
    "little girl",
    "young child",
    "young girl"
  ],
  "nsfw_detected": false
}

Use Cases

This action is particularly useful for platforms that allow users to upload images, such as social media sites, gaming forums, or any community-driven application. By implementing the NSFW filter, developers can automatically screen images before they are displayed, significantly reducing the risk of inappropriate content being shared. This not only protects users but also builds trust and credibility for the platform. The following Python example shows how to call the action:

import requests
import json

# Replace with your actual Cognitive Actions API key and endpoint
# Ensure your environment securely handles the API key
COGNITIVE_ACTIONS_API_KEY = "YOUR_COGNITIVE_ACTIONS_API_KEY"
# This endpoint URL is hypothetical and should be documented for users
COGNITIVE_ACTIONS_EXECUTE_URL = "https://api.cognitiveactions.com/actions/execute"

action_id = "79be82f7-d67f-44e8-9496-d644c006303c" # Action ID for: Filter NSFW Content in Images

# Construct the exact input payload based on the action's requirements
# This example uses the predefined example_input for this action:
payload = {
  "image": "https://replicate.delivery/pbxt/HopNzJn8F7MbMq47EpzTOubJgxT4N4DWZwb5dWURi885Zm6C/kid.jpeg"
}

headers = {
    "Authorization": f"Bearer {COGNITIVE_ACTIONS_API_KEY}",
    "Content-Type": "application/json",
    # Add any other required headers for the Cognitive Actions API
}

# Prepare the request body for the hypothetical execution endpoint
request_body = {
    "action_id": action_id,
    "inputs": payload
}

print("--- Calling Cognitive Action: Filter NSFW Content in Images ---")
print(f"Endpoint: {COGNITIVE_ACTIONS_EXECUTE_URL}")
print(f"Action ID: {action_id}")
print("Payload being sent:")
print(json.dumps(request_body, indent=2))
print("------------------------------------------------")

try:
    response = requests.post(
        COGNITIVE_ACTIONS_EXECUTE_URL,
        headers=headers,
        json=request_body
    )
    response.raise_for_status() # Raise an exception for bad status codes (4xx or 5xx)

    result = response.json()
    print("Action executed successfully. Result:")
    print(json.dumps(result, indent=2))

except requests.exceptions.RequestException as e:
    print(f"Error executing action {action_id}: {e}")
    if e.response is not None:
        print(f"Response status: {e.response.status_code}")
        try:
            print(f"Response body: {e.response.json()}")
        except ValueError:  # covers json.JSONDecodeError and the simplejson variant
            print(f"Response body (non-JSON): {e.response.text}")
    print("------------------------------------------------")
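Once the action returns, the response fields can drive a moderation decision. The sketch below assumes the output shape shown above; the moderate_image helper and its policy of flagging "special" tags for human review are illustrative choices, not part of the Cognitive Actions API.

```python
def moderate_image(result: dict) -> bool:
    """Return True if the image is safe to display, based on the filter's output."""
    # Block the image outright if the filter detected NSFW content
    if result.get("nsfw_detected"):
        return False
    # Optionally route images with "special" tags to human review
    if result.get("special"):
        print(f"Flagged for review: {result['special']}")
    return True

# Using the example output from above:
example_result = {
    "nsfw": [],
    "special": ["little girl", "young child", "young girl"],
    "nsfw_detected": False
}

print(moderate_image(example_result))  # True: no NSFW content detected
```

Because the image passes the filter but carries special tags, a production system might queue it for review rather than publishing it immediately.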

Conclusion

The Nsfw Filter offers a robust solution for developers looking to enhance their content moderation capabilities. By filtering NSFW content in images, you can create a safer online space for users, thereby improving user experience and satisfaction. Start integrating the Nsfw Filter today to ensure your platform remains a welcoming environment for all.