- cross-posted to:
- [email protected]
cross-posted from: https://literature.cafe/post/1133610
I am seeing a lot of fearmongering and misinformation regarding recent events (CSAM being posted in now-closed large lemmy.world communities). I say this as someone who, along with other admins, brought attention to the issue when I noticed the content federating out.
Yes, this is an issue, and what has happened with regard to CSAM is deeply troubling, but solutions and ideas are being discussed and worked on as we speak. This is not just a Lemmy issue but an internet-wide issue that affects all forms of social media. There is no clear-cut solution, but most jurisdictions have some form of safe harbor policy for server operators acting in good faith.
A good analogy: imagine someone drops something illegal into your yard, which is open to the public. If someone stumbles upon those items, you aren't going to be hunted down for them unless there is evidence showing you knew about them and left them there without reporting them, or that you sold or traded them. If someone comes up to you and says "hey, there's this illegal thing on your property," and you report it, hand it over to the relevant authorities, and check any security cameras you have and pass the footage along as well, then you'd be fine.
A similar principle exists online, specifically on platforms such as this. Obviously the FBI will raid whoever they want and will find reasons to if they need to, but I can tell you with near certainty they aren't particularly concerned with a bunch of nerds hosting a (currently) niche piece of software created by two communists as a pet project, one that gained popularity over the summer because an internet business decided to shoot itself in the foot. They are specifically out to find people who are selling, trading, and making CSAM. Those who knowingly and intentionally host and distribute such content are the ones they are after.
I get it. This is anxiety-inducing, especially as an admin. But so long as you are preserving and reporting any content that is brought to your attention in a timely manner, and are following development and active mitigation efforts, you should be fine. If you want more detail, click the link above.
I am not a lawyer, and of course things vary from country to country, so it's a good idea to check reputable sources on this matter as well.
This is also a topic that is distressing for most normal, well-adjusted people, for pretty obvious reasons. I get the anxiety over this, I really do. It's been a rough few days for many of us. But playing into other people's anxiety over this is not helping anyone. What is helping is following and contributing to the discussion of potential fixes and mitigation efforts, and taking the time to calmly understand what you as an operator are responsible for within your jurisdiction.
Also, if you witnessed the content being discussed here, no one will fault you for taking a step away from Lemmy. Don't sacrifice your mental health over a volunteer project; it's seriously not worth it. And if this has made you question self-hosting Lemmy or any other platform like it, that is valid too. It should be made clearer that this is a risk you take on when running any kind of website connected to the open internet.
I would also suggest reading User Generated Content and the Fediverse: A Legal Primer – https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer
Make sure to follow its advice - the safe harbor isn't automatic, and you will need to take affirmative steps.
The safe harbor doesn’t apply automatically. First, the safe harbor is subject to two disqualifiers: (1) actual or “red flag” knowledge of specific infringement; and (2) profiting from infringing activity if you have the right and ability to control it. The standards for these categories are contested; if you are concerned about them, you may wish to consult a lawyer.
Second, a provider must take some affirmative steps to qualify:
Designate a DMCA agent with the Copyright Office.
This may be the best $6 you ever spend. A DMCA agent serves as an official contact for receiving copyright complaints, following the process discussed below. Note that your registration must be renewed every three years, and if you fail to register an agent you may lose the safe harbor protections. You must also make the agent's contact information available on your website, such as via a link to a publicly viewable page that describes your instance and policies.
Have a clear DMCA policy, including a repeat infringer policy, and follow it.
To qualify for the safe harbors, all service providers must “adopt and reasonably implement, and inform subscribers and account holders of . . . a policy that provides for the termination in appropriate circumstances of . . . repeat infringers.” There’s no standard definition for “repeat infringer” but some services have adopted a “three strikes” policy, meaning they will terminate an account after three unchallenged claims of infringement. Given that copyright is often abused to take down lawful speech, you may want to consider a more flexible approach that gives users ample opportunity to appeal prior to termination. Courts that have examined what constitutes “reasonable implementation” of a termination process have stressed that service providers need not shoulder the burden of policing infringement.
…
And further down:
Service providers are required to report any CSAM on their servers to the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC), a private, nonprofit organization established by the U.S. Congress, and can be criminally prosecuted for knowingly facilitating its distribution. NCMEC shares those reports with law enforcement. However, you are not required to affirmatively monitor your instance for CSAM.
> but so long as you are preserving and reporting any content that is brought to your attention
Say what? Preserving it?
Some countries (mainly the US; I don't know about elsewhere) require preserving the files for at least 90 days in some form of access-restricted, secured storage (usually a tiny VPS), essentially treated like toxic waste.
(h) Preservation.-
(1) In general.-For the purposes of this section, a completed submission by a provider of a report to the CyberTipline under subsection (a)(1) shall be treated as a request to preserve the contents provided in the report for 90 days after the submission to the CyberTipline.
(2) Preservation of commingled content.-Pursuant to paragraph (1), a provider shall preserve any visual depictions, data, or other digital files that are reasonably accessible and may provide context or additional information about the reported material or person.
(3) Protection of preserved materials.-A provider preserving materials under this section shall maintain the materials in a secure location and take appropriate steps to limit access by agents or employees of the service to the materials to that access necessary to comply with the requirements of this subsection.
(4) Authorities and duties not affected.-Nothing in this section shall be construed as replacing, amending, or otherwise interfering with the authorities and duties under section 2703.