Twitter using Section 230 immunity as defense in child-porn case

A lawsuit in federal court in California alleges Twitter allowed child-porn videos and images to remain on its platform after receiving complaints, because the posts didn’t “violate policies.”

The social-media behemoth is trying to get out of the case entirely by relying on the immunity from litigation afforded by Section 230 of the federal Communications Decency Act, reports the Red State Nation blog.

The law grants social-media platforms immunity because they are regarded as neutral platforms for the flow of information rather than publishers. However, many Congress members argue, Twitter essentially has acted as an editor by censoring or flagging as “false” certain controversial viewpoints.

Twitter allegedly refused requests to remove the pornographic content, including videos of a 13-year-old who was tricked into sharing explicit videos and then blackmailed with them. The videos eventually were removed after an agent of the Department of Homeland Security intervened.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center of Missing and Exploited Children,” the lawsuit said.

The New York Post reported the complaint alleges Twitter knowingly hosts people who use the platform to exchange child-porn material and profits from it through advertising.

The complaint in the Northern District of California was filed by the victim, whose name was not revealed, and his mother.

But now the Red State Nation report includes images of court filings in which Twitter claims it is protected from lawsuits under “CDA 230 immunity.”

The plaintiff, Twitter charges in the document, “does not adequately plead that Twitter knowingly and affirmatively participated in any sex trafficking venture.”

Twitter’s motion to dismiss argued Section 230 gives platforms immunity for the failure to remove “offensive third-party content.”

The company conceded that there was damage but denied any responsibility.

“Plaintiff John Doe appears to have suffered appallingly at the hands of unknown individuals, who tricked and manipulated him into making and sharing explicit pictures and videos of himself and another individual in 2017, when he was a minor. But this case ultimately does not seek to hold those perpetrators accountable for the suffering they inflicted on plaintiff,” Twitter said.

“Rather, this case seeks to hold Twitter liable because a compilation of that explicit video content was … years later – posted by others on Twitter’s platform and although Twitter did remove the content, it allegedly did not act quickly enough.”

Twitter claimed its "mistakes or delays" don't make it "a knowing participant in a sex trafficking venture."

The company claimed it has “zero tolerance” for such material but admitted it had to take action against nearly 440,000 accounts in only six months of 2020.

And the company contended “it is simply not possible” to remove all offensive content immediately.

Congress gave it “broad immunity from legal claims arising out of failure to remove content.”

Twitter claimed that an exception to the immunity, for sites that post sex-trafficking material, doesn’t apply, because Twitter didn’t know the boy “was a victim of sex trafficking.”

The Red State Nation report noted the videos were reported to Twitter “at least three times,” but the company declined to take action until federal agents were involved.

The victim was 13 and 14 when he “was manipulated” into sending images to human traffickers “who pretended to be a 16-year-old girl who went to his school.”

He then was blackmailed into sending more.

Eventually, he blocked the traffickers, but in 2019, the videos reappeared on two accounts “known to share child sexual abuse material.”


Twitter’s response to one complaint about the videos was: “Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”

The company then recommended reviewing the possibility of a “copyright infringement.”

The teen told the company: “What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.”


Source: World Net Daily
