A powerful TypeScript library for detecting images and videos that are unsuitable for public advertising, powered by Claude AI. Perfect for media owners, ad servers, DSPs, and performance advertisers.
Keep your advertising platform safe with intelligent content detection
Uses Claude AI to intelligently detect inappropriate content in images and videos
Quick analysis: screenshots are extracted from the video every 0.5 seconds, so only a small sample of frames needs review (the sampling approach is sketched below)
Simple TypeScript library with comprehensive type definitions and clear API
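For video, sampling at fixed 0.5-second intervals bounds the number of frames sent to the model while still covering the whole clip. Below is a minimal sketch of that sampling step, assuming the ffmpeg CLI is available; the helper name and the temp-file handling are illustrative assumptions, not part of the library:

import { execFileSync } from 'node:child_process';
import { mkdtempSync, readdirSync, readFileSync, writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Hypothetical helper: sample one frame every 0.5 s from a video buffer
// using the ffmpeg CLI, returning each frame as a PNG buffer.
function sampleFrames(video: Buffer): Buffer[] {
  const dir = mkdtempSync(join(tmpdir(), 'ad-frames-'));
  const input = join(dir, 'input.mp4');
  writeFileSync(input, video);
  // fps=2 emits two frames per second of footage, i.e. one every 0.5 s.
  execFileSync('ffmpeg', ['-i', input, '-vf', 'fps=2', join(dir, 'frame-%04d.png')]);
  return readdirSync(dir)
    .filter((name) => name.startsWith('frame-'))
    .sort()
    .map((name) => readFileSync(join(dir, name)));
}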
Get started in minutes with our intuitive API
npm install ad-moderator
import { AdModeratorClient } from 'ad-moderator';
const client = new AdModeratorClient('your-anthropic-api-key');
// Check image compliance
const imageBuffer = Buffer.from('your-image-data'); // placeholder; load real image bytes in practice
const result = await client.getAdStatus(imageBuffer);
if (result?.isAdCompliant) {
console.log('Image is safe for advertising');
} else {
console.log('Reasons:', result?.negativeReasons);
}
// Check video compliance
const videoBuffer = Buffer.from('your-video-data'); // placeholder; load real video bytes in practice
const videoResult = await client.getVideoAdStatus(videoBuffer);
if (!videoResult?.isAdCompliant) {
  console.log('Reasons:', videoResult?.negativeReasons);
}
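Based on the fields used above, the result object is shaped roughly as follows. This is inferred from the quickstart; the type definitions bundled with the package are authoritative.

// Approximate result shape, inferred from the usage above; consult the
// package's bundled type definitions for the exact types.
interface AdStatus {
  isAdCompliant: boolean;       // true when the creative is safe to serve
  negativeReasons?: string[];   // populated when compliance checks fail
}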
Analyze images for inappropriate content with detailed compliance reports
Extract screenshots and analyze video content for advertising compliance
Define custom moderation criteria specific to your platform's requirements (one way to layer such rules is sketched below)
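The library's own custom-criteria API is not shown in this quickstart. As a hypothetical illustration, platform-specific rules can be layered on top of the verdict using only the calls shown above; the wrapper and the tolerated-reason list are assumptions, not part of the ad-moderator API:

import { AdModeratorClient } from 'ad-moderator';

const client = new AdModeratorClient(process.env.ANTHROPIC_API_KEY!);

// Hypothetical platform policy: flagged reasons this platform tolerates.
const toleratedReasons = ['mild language'];

// Hypothetical wrapper: pass creatives that are compliant outright, or whose
// only flagged reasons are ones the platform explicitly tolerates.
async function isAllowedOnPlatform(image: Buffer): Promise<boolean> {
  const status = await client.getAdStatus(image);
  if (!status) return false;             // treat analysis failures as unsafe
  if (status.isAdCompliant) return true;
  return (status.negativeReasons ?? []).every((reason) =>
    toleratedReasons.includes(reason.toLowerCase())
  );
}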
Visual examples of how Ad Moderator works
Real-time moderation results in your advertising platform with AI-powered content detection
Clean API with comprehensive TypeScript support and easy integration
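In a real integration, creatives usually come from disk or an upload pipeline rather than hard-coded strings. Here is a short sketch that moderates a local directory of images; the 'creatives' directory and the PNG filter are assumptions for the example:

import { readdirSync, readFileSync } from 'node:fs';
import { join } from 'node:path';
import { AdModeratorClient } from 'ad-moderator';

const client = new AdModeratorClient(process.env.ANTHROPIC_API_KEY!);

// Moderate every PNG in a hypothetical 'creatives' directory and log verdicts.
for (const file of readdirSync('creatives').filter((f) => f.endsWith('.png'))) {
  const result = await client.getAdStatus(readFileSync(join('creatives', file)));
  console.log(file, result?.isAdCompliant ? 'compliant' : result?.negativeReasons);
}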
Ideal for various advertising and content platforms
Join thousands of developers using Ad Moderator to keep their platforms safe