Measuring the enforcement gap in online content takedowns
Takedown Lab is an independent research project investigating how effectively existing legal mechanisms — DMCA notices, GDPR requests, and platform abuse reports — protect victims of non-consensual intimate imagery (NCII) and AI-generated deepfake content. We focus on East Asian platforms where enforcement data is scarce and victim protections remain weak.
Project
This project examines a critical gap in copyright enforcement: the disparity between Google Search de-indexing — where a URL is removed from search results after a DMCA notice — and actual source-level content removal, where the content itself is deleted from the hosting server. Our research suggests that de-indexing alone provides incomplete protection, as infringing content often remains fully accessible at its original URL.
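The distinction above can be made concrete with a small status-code classifier. A de-indexed URL vanishes from search results but may still return HTTP 200 at its origin; the mapping below is an illustrative sketch (the category names and thresholds are our assumptions, not the project's formal rubric):

```python
# Sketch: distinguishing source-level removal from mere de-indexing.
# HTTP status at the original URL is one practical (if imperfect) signal.

REMOVED = {404, 410}          # content gone at origin
LEGAL_BLOCK = {451}           # unavailable for legal reasons (RFC 7725)
ACCESS_DENIED = {401, 403}    # gated, but not necessarily removed

def classify_origin_status(status: int) -> str:
    """Map an HTTP status from the original URL to an enforcement outcome."""
    if status in REMOVED:
        return "source-removed"
    if status in LEGAL_BLOCK:
        return "legally-blocked"
    if status in ACCESS_DENIED:
        return "inconclusive"
    if 200 <= status < 300:
        return "still-accessible"   # the enforcement gap this study measures
    return "inconclusive"
```

A URL classified "still-accessible" after successful Google de-indexing is exactly the gap case: invisible to searchers, intact at the source.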
Takedown Lab maintains no commercial affiliation. All findings are published as open-access resources.
Researcher
Minseong Kim is an independent cybersecurity researcher with professional experience in network security. Over the past two years, his work has focused on NCII detection, tracking, and removal enforcement across East Asian platforms. He has monitored 65 platforms hosting NCII content, documented over 420,000 detection events, and facilitated more than 1,400 content removals with a 95.3% verified success rate. His enforcement work spans multiple jurisdictions including DMCA (US), GDPR (EU), and KISO (South Korea), and involves OSINT-based infrastructure analysis of hosting providers, CDN networks, and domain registrars.
Previously published: 34 articles on NCII policy, DMCA enforcement gaps, and platform accountability.
De-indexed but Not Deleted: Measuring the Enforcement Gap in DMCA Takedown Notices for Creator Platform Content
Despite active DMCA filing by rights holders, a significant proportion of reported URLs remain fully accessible at their source even after successful Google Search de-indexing. This paper quantifies the enforcement gap across 65 platforms, analyzes removal success rates by jurisdiction and reporting mechanism, and examines why certain hosting infrastructures resist removal despite repeated legal notices. The dataset comprises 1,400+ removal cases and 420,000+ detection events collected over 18 months of continuous monitoring.
Contents
- Executive Summary
- Background: DMCA §512, De-indexing vs. Source Removal
- The East Asian NCII Landscape
- Methodology: Data Collection & URL Survival Analysis
- Findings: Quantifying the Enforcement Gap
- Platform & CDN Response Analysis
- Case Studies
- Policy Implications & the Take It Down Act
- Limitations & Future Work
The completed paper will be released as an open-access PDF on this site. We are exploring submission to policy venues including TrustCon and RightsCon. Summary findings will also be published as educational content for public awareness.
Dataset overview
The following data are derived from an independent monitoring infrastructure tracking NCII and deepfake content across East Asian platforms. Continuous collection began in November 2024.
| Metric | Value | Notes |
|---|---|---|
| Detection events | 420,700+ | Cumulative since Nov 2024 |
| Platforms monitored | 65 | 24/7 automated crawling |
| Removal cases completed | 1,417 | Verified URL removal |
| Removal success rate | 95.3% | Source-level confirmed |
| Individuals under monitoring | 316 | Active protection |
Platform detection summary
Detection events by platform, ranked by volume. Domain names partially redacted to prevent direct linking.
| # | Platform | Detections | Images | Artists |
|---|---|---|---|---|
| 1 | r***deepfakes.com | 93,351 | 36,878 | 309 |
| 2 | x***fade.com | 81,817 | 29,262 | 307 |
| 3 | d***fades.com | 54,638 | 20,447 | 312 |
| 4 | k***fakes.com | 29,980 | 6,890 | 290 |
| 5 | k***fakes.com | 29,553 | 7,582 | 284 |
| 6 | d***porn.site | 24,610 | 9,374 | 301 |
| 7 | t***deep.com | 23,696 | 9,867 | 302 |
| 8 | d***celeb.com | 23,610 | 8,916 | 306 |
| 9 | a***fakes.com | 15,955 | 4,999 | 300 |
| 10 | f***bin.com | 9,514 | 4,657 | 275 |
| 11 | d***kpop.com | 8,480 | 2,678 | 236 |
| 12 | i***fake.org | 6,573 | 2,006 | 173 |
| 13 | i***fap.com | 5,048 | 1,645 | 156 |
| 14 | k***e.club | 3,866 | 1,242 | 221 |
| 15 | p***idol.org | 3,092 | 1,552 | 166 |
| — | + 20 additional platforms | 34,933 | 14,456 | — |
Sample enforcement cases
Anonymized data from completed removal campaigns. Case identifiers randomized, personal information redacted. Duration measured from first detection to last confirmed removal.
| Case | URLs | Sites | Removed | Rate | Duration | Report types |
|---|---|---|---|---|---|---|
| A-0037 | 33 | 18 | 32 | 97.0% | 4.9d | CSAM, NCII |
| A-0041 | 127 | 23 | 118 | 92.9% | 8.2d | CSAM |
| A-0058 | 344 | 51 | 275 | 79.9% | 29.8d | CSAM, DMCA |
| A-0063 | 89 | 14 | 86 | 96.6% | 6.1d | CSAM |
| A-0071 | 52 | 9 | 49 | 94.2% | 3.7d | DMCA, NCII |
| A-0084 | 213 | 31 | 194 | 91.1% | 14.3d | CSAM, DMCA |
| A-0092 | 41 | 7 | 41 | 100% | 2.4d | NCII |
| A-0105 | 168 | 28 | 152 | 90.5% | 11.6d | CSAM |
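The Rate column is simply removed URLs divided by reported URLs, to one decimal place. A small helper reproduces the figures from the table (the three cases below use the anonymized values shown above):

```python
# Recomputing the per-case removal rate shown in the table above.

def removal_rate(removed: int, total_urls: int) -> float:
    """Percentage of reported URLs confirmed removed, to one decimal."""
    return round(100 * removed / total_urls, 1)

# (removed, total URLs) taken from rows A-0037, A-0058, and A-0092
cases = {"A-0037": (32, 33), "A-0058": (275, 344), "A-0092": (41, 41)}
for case_id, (removed, total) in cases.items():
    print(case_id, removal_rate(removed, total))
```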
Methodology
The study uses a four-phase approach:

- Phase 1: continuous automated monitoring of 65 platforms using AI-based content fingerprinting; each detection is logged by URL, timestamp, and platform.
- Phase 2: removal requests submitted through the applicable channels (DMCA §512, GDPR erasure, KISO, direct platform abuse reports), with every submission tracked.
- Phase 3: URL survival analysis, re-checking source URLs at 7, 30, 60, and 90 days after de-indexing to measure whether content remains accessible at origin.
- Phase 4 (planned): cross-referencing internal data with the Lumen Database at scale to analyze enforcement patterns across reporting organizations.
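The Phase 3 re-check cadence can be sketched as follows; the function and field names are illustrative, not the project's actual pipeline:

```python
# Sketch of the 7/30/60/90-day URL survival re-check schedule.
from datetime import date, timedelta

RECHECK_DAYS = (7, 30, 60, 90)

def recheck_dates(deindexed_on: date) -> list[date]:
    """Dates on which a de-indexed URL should be probed at its origin."""
    return [deindexed_on + timedelta(days=d) for d in RECHECK_DAYS]

def survived(probe_results: dict[int, bool]) -> bool:
    """True if the content was still accessible at any scheduled probe.

    probe_results maps a re-check day offset to whether the URL
    still served content on that probe.
    """
    return any(probe_results.get(d, False) for d in RECHECK_DAYS)
```

A URL for which `survived` returns True after de-indexing is counted toward the enforcement gap rather than as a source-level removal.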
Jurisdictions
Removal requests are filed under DMCA (US-hosted and US-accessible platforms), GDPR (EU/UK-hosted platforms), and KISO (Korean domestic platforms and services coordinated through the Korea Internet Self-Governance Organization). The choice of mechanism depends on hosting location, registrar jurisdiction, and CDN provider.
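A heavily simplified sketch of this routing logic is shown below. Real selection also weighs registrar jurisdiction and CDN provider, and the country list is a non-exhaustive assumption for illustration:

```python
# Illustrative first-pass mechanism selection by hosting jurisdiction.

EU_UK = {"DE", "FR", "NL", "IE", "GB"}  # non-exhaustive, assumed for the sketch

def select_mechanism(hosting_country: str, korean_service: bool = False) -> str:
    """Pick a takedown channel from the hosting jurisdiction."""
    if korean_service:
        return "KISO"                    # Korean domestic coordination
    if hosting_country in EU_UK:
        return "GDPR erasure"
    if hosting_country == "US":
        return "DMCA 512"
    return "platform abuse report"       # fallback when no statutory hook applies
```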
We welcome collaboration with researchers, journalists, and policymakers working on NCII enforcement, platform accountability, or digital rights. Please reach out for research inquiries, data collaboration, or media requests.