Do you have a source on that? It doesn’t smell right. Every platform (All of them. Every single one. No exceptions) that allows user-submitted images/videos has the problem that some of that content is illegal. CSAM is the most obvious example, but not the only one. What made Tumblr different from the 20 million+ instances on Facebook? Source1, Source2 At the time, scrolling through r/All for just a few minutes was nearly certain to show something pornographic, although not CSAM.
The story I heard (admittedly, I’m having trouble finding a source at the moment) is that Tumblr’s tools for removing CSAM weren’t good enough. While they would remove the offending image when it was reported, they did not delete the connections to other users/groups, which meant it was easy to find more even after some had been removed. In turn, that meant Tumblr quickly became the platform of choice for anyone uploading this stuff, creating a higher volume and ratio of illegal content.
While I know Apple has long been anti-porn, it seems unlikely that they would take such an arbitrary hard line against Tumblr while ignoring countless other platforms with the same problem.
*Checks to make sure my instance federates with them*
Yup, thank you! I had thought LemmyNSFW was our single point of failure.