in General Factchecking by Apprentice (1.0k points)
In 2021, Apple abandoned a tool that would have scanned Apple products for child sexual abuse material and reported it to law enforcement. The company is now facing a lawsuit alleging that it sold products it claimed would protect victims but is failing to do so. There is also concern that such illegal material can now circulate without any consequences.
ago by Newbie (220 points)
While you make some good points and offer solid evidence, the validity of this claim is, in my opinion, somewhat lacking. More evidence, and perhaps comparisons to other software companies, would make your case more concrete. Overall, an interesting yet scary topic.
ago by (140 points)
The claim that Apple is being sued for abandoning its tool to detect when images of child abuse are being shared, and thereby abandoning its pledge to protect victims, appears to be true. We know this because the woman suing Apple shared her story in an interview: "She claims that images of her abuse were stored on iCloud, and Apple's decision allowed the material to be widely shared." By declining to use software that scans for CSAM, Apple is breaking its pledge to protect victims. As a result, Apple's reported numbers have been significantly lower than those of other tech companies.

https://www.cnet.com/tech/services-and-software/apples-abandonment-of-icloud-csam-scanner-is-hurting-victims-lawsuit-alleges/

https://www.cnn.com/2022/12/08/tech/apple-csam-tool/index.html

1 Answer

ago by (160 points)
This appears to be true. Apple is facing lawsuits over claims that abusive material was not being reported to authorities, with the company's justification being "user protection." Apple also has historically reported less abusive material than its competitors, and it did stop the implementation of the feature that would have helped report such material.

https://www.benzinga.com/24/12/42381987/apple-faces-12b-lawsuit-for-dropping-csam-detection-while-google-and-meta-set-higher-standards-for-child-safety

https://www.cnn.com/2022/12/08/tech/apple-csam-tool/index.html
True
