Takedown Lab

Measuring the enforcement gap in online content takedowns

Takedown Lab is an independent research project investigating how effectively existing legal mechanisms — DMCA notices, GDPR requests, and platform abuse reports — protect victims of non-consensual intimate imagery (NCII) and AI-generated deepfake content. We focus on East Asian platforms where enforcement data is scarce and victim protections remain weak.

About

Project

This project examines a critical gap in copyright enforcement: the disparity between Google Search de-indexing — where a URL is removed from search results after a DMCA notice — and actual source-level content removal, where the content itself is deleted from the hosting server. Our research suggests that de-indexing alone provides incomplete protection, as infringing content often remains fully accessible at its original URL.
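The distinction can be made concrete with a minimal sketch. The helper below is hypothetical (not part of any real measurement pipeline); it classifies a takedown outcome from two observations: whether the URL still appears in the search index, and the HTTP status returned by the origin server.

```python
# Hypothetical sketch: distinguish de-indexing from source-level removal.
# A de-indexed URL vanishes from search results, but the origin server
# may still serve the content with HTTP 200.

def classify_outcome(in_search_index: bool, origin_status: int) -> str:
    """Classify a reported URL's enforcement state."""
    origin_live = origin_status == 200
    if not in_search_index and origin_live:
        return "de-indexed only"      # the enforcement gap this project measures
    if not in_search_index and not origin_live:
        return "fully removed"
    if in_search_index and not origin_live:
        return "removed at source, index stale"
    return "no enforcement"
```

In this framing, "de-indexed only" is precisely the incomplete-protection case: the notice succeeded against the search engine but not against the host.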

Takedown Lab maintains no commercial affiliation. All findings are published as open-access resources.

Researcher

Minseong Kim is an independent cybersecurity researcher with professional experience in network security. Over the past two years, his work has focused on NCII detection, tracking, and removal enforcement across East Asian platforms. He has monitored 65 platforms hosting NCII content, documented over 420,000 detection events, and facilitated more than 1,400 content removals with a 95.3% verified success rate. His enforcement work spans multiple jurisdictions including DMCA (US), GDPR (EU), and KISO (South Korea), and involves OSINT-based infrastructure analysis of hosting providers, CDN networks, and domain registrars.

Previously published: 34 articles on NCII policy, DMCA enforcement gaps, and platform accountability.

Research
Working Paper · In Progress

De-indexed but Not Deleted: Measuring the Enforcement Gap in DMCA Takedown Notices for Creator Platform Content

Minseong Kim · Takedown Lab · Expected Q3 2026

Despite active DMCA filing by rights holders, a significant proportion of reported URLs remain fully accessible at their source even after successful Google Search de-indexing. This paper quantifies the enforcement gap across 65 platforms, analyzes removal success rates by jurisdiction and reporting mechanism, and examines why certain hosting infrastructures resist removal despite repeated legal notices. The dataset comprises 1,400+ removal cases and 420,000+ detection events collected over 18 months of continuous monitoring.

Contents

  1. Executive Summary
  2. Background: DMCA §512, De-indexing vs. Source Removal
  3. The East Asian NCII Landscape
  4. Methodology: Data Collection & URL Survival Analysis
  5. Findings: Quantifying the Enforcement Gap
  6. Platform & CDN Response Analysis
  7. Case Studies
  8. Policy Implications & the Take It Down Act
  9. Limitations & Future Work

The completed paper will be released as an open-access PDF on this site. We are exploring submission to policy venues including TrustCon and RightsCon. Summary findings will also be published as educational content for public awareness.

Data

Dataset overview

The following data is derived from an independent monitoring infrastructure tracking NCII and deepfake content across East Asian platforms. Continuous collection began November 2024.

| Metric | Value | Notes |
| --- | --- | --- |
| Detection events | 420,700+ | Cumulative since Nov 2024 |
| Platforms monitored | 65 | 24/7 automated crawling |
| Removal cases completed | 1,417 | Verified URL removal |
| Removal success rate | 95.3% | Source-level confirmed |
| Individuals under monitoring | 316 | Active protection |

Platform detection summary

Detection events by platform, ranked by volume. Domain names partially redacted to prevent direct linking.

| # | Platform | Detections | Images | Artists |
| --- | --- | --- | --- | --- |
| 1 | r***deepfakes.com | 93,351 | 36,878 | 309 |
| 2 | x***fade.com | 81,817 | 29,262 | 307 |
| 3 | d***fades.com | 54,638 | 20,447 | 312 |
| 4 | k***fakes.com | 29,980 | 6,890 | 290 |
| 5 | k***fakes.com | 29,553 | 7,582 | 284 |
| 6 | d***porn.site | 24,610 | 9,374 | 301 |
| 7 | t***deep.com | 23,696 | 9,867 | 302 |
| 8 | d***celeb.com | 23,610 | 8,916 | 306 |
| 9 | a***fakes.com | 15,955 | 4,999 | 300 |
| 10 | f***bin.com | 9,514 | 4,657 | 275 |
| 11 | d***kpop.com | 8,480 | 2,678 | 236 |
| 12 | i***fake.org | 6,573 | 2,006 | 173 |
| 13 | i***fap.com | 5,048 | 1,645 | 156 |
| 14 | k***e.club | 3,866 | 1,242 | 221 |
| 15 | p***idol.org | 3,092 | 1,552 | 166 |
| — | + 20 additional platforms | 34,933 | 14,456 | — |

Sample enforcement cases

Anonymized data from completed removal campaigns. Case identifiers randomized, personal information redacted. Duration measured from first detection to last confirmed removal.

| Case | URLs | Sites | Removed | Rate | Duration | Methods |
| --- | --- | --- | --- | --- | --- | --- |
| A-0037 | 33 | 18 | 32 | 97.0% | 4.9d | CSAM, NCII |
| A-0041 | 127 | 23 | 118 | 92.9% | 8.2d | CSAM |
| A-0058 | 344 | 51 | 275 | 79.9% | 29.8d | CSAM, DMCA |
| A-0063 | 89 | 14 | 86 | 96.6% | 6.1d | CSAM |
| A-0071 | 52 | 9 | 49 | 94.2% | 3.7d | DMCA, NCII |
| A-0084 | 213 | 31 | 194 | 91.1% | 14.3d | CSAM, DMCA |
| A-0092 | 41 | 7 | 41 | 100% | 2.4d | NCII |
| A-0105 | 168 | 28 | 152 | 90.5% | 11.6d | CSAM |
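The Rate column is the share of reported URLs with a confirmed source-level removal, rounded to one decimal place. A minimal sketch of the computation, checked against sample figures from the cases above (32 of 33 URLs removed ≈ 97.0%; 118 of 127 ≈ 92.9%; 41 of 41 = 100%):

```python
# Recompute a case's removal rate: removed URLs as a percentage of
# reported URLs, rounded to one decimal place.
def removal_rate(urls: int, removed: int) -> float:
    return round(100 * removed / urls, 1)

# Figures taken from the sample enforcement cases above.
print(removal_rate(33, 32))    # → 97.0
print(removal_rate(127, 118))  # → 92.9
print(removal_rate(41, 41))    # → 100.0
```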

Methodology

The study uses a multi-phase approach:

Phase 1 — Continuous automated monitoring of 65 platforms using AI-based content fingerprinting; each detection is logged by URL, timestamp, and platform.

Phase 2 — Removal requests submitted through applicable channels (DMCA §512, GDPR erasure, KISO, direct platform abuse reports), with each submission tracked.

Phase 3 — URL survival analysis: source URLs are re-checked 7, 30, 60, and 90 days after de-indexing to measure whether content remains accessible at origin.

Phase 4 (planned) — Cross-referencing internal data with the Lumen Database at scale to analyze enforcement patterns across reporting organizations.
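The Phase 3 re-check schedule can be sketched as follows. This is an illustrative helper, not the project's actual tooling; the interval values are those stated in the methodology.

```python
from datetime import date, timedelta

# Re-check intervals (days after de-indexing), as stated in the methodology.
RECHECK_DAYS = (7, 30, 60, 90)

def recheck_schedule(deindexed_on: date) -> list[date]:
    """Dates at which a source URL's availability is re-tested."""
    return [deindexed_on + timedelta(days=d) for d in RECHECK_DAYS]

def survived_all(checks: list[bool]) -> bool:
    """True if the URL still served content at every re-check,
    i.e. the content persisted at origin despite de-indexing."""
    return all(checks)
```

For example, a URL de-indexed on 2025-01-01 would be re-checked on 2025-01-08, 2025-01-31, 2025-03-02, and 2025-04-01.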

Jurisdictions

Removal requests are filed under DMCA (US-hosted and US-accessible platforms), GDPR (EU/UK-hosted platforms), and KISO (Korean domestic platforms and services coordinated through the Korea Internet Self-governance Organization). The choice of mechanism depends on hosting location, registrar jurisdiction, and CDN provider.
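As a rough illustration of that routing logic, the sketch below maps a hosting region code to a default mechanism. It is a simplification under stated assumptions: real routing also weighs registrar jurisdiction and CDN provider, and a single case may use several channels in parallel.

```python
# Hypothetical routing sketch: pick a default legal mechanism from the
# hosting region. Simplified; registrar jurisdiction and CDN provider
# also factor into the real decision.
def choose_mechanism(hosting_region: str) -> str:
    if hosting_region == "US":
        return "DMCA"
    if hosting_region in {"EU", "UK"}:
        return "GDPR erasure"
    if hosting_region == "KR":
        return "KISO"
    return "platform abuse report"   # fallback channel elsewhere
```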

Contact

For research inquiries, data collaboration, or media requests:

[email protected]

We welcome collaboration with researchers, journalists, and policy makers working on NCII enforcement, platform accountability, or digital rights.