Sanctioned Suicide: A Harmful Online Forum, Cloudflare's Role, and What You Can Do to Help Stop It
I'm posting today as a licensed social worker with over a decade of experience in suicide prevention and crisis intervention. Recently, I worked with a family whose 16-year-old son died by suicide after frequenting the Sanctioned Suicide forum. His parents discovered in his browsing history that he had spent weeks on this platform, where he received detailed instructions on obtaining and using sodium nitrite to end his life. Forum members had actively encouraged him, even providing shopping links and dismissing his occasional expressions of doubt. As the professional who had to help this family process their devastating loss, I feel compelled to share information about this harmful website and the technical infrastructure that keeps it operational. This isn't about censorship or free speech. It's about a platform that has been linked to at least 95 documented deaths while actively undermining suicide prevention efforts.
The internet has created unprecedented spaces for community and connection, but it has also enabled harmful platforms that can promote dangerous behaviors. Sanctioned Suicide, a pro-suicide forum, has emerged as one of the most concerning examples, raising questions about the responsibilities of online service providers like Cloudflare. This report examines the nature of this controversial website, its documented harms, and the criticisms directed at companies that enable its continued operation.
What is Sanctioned Suicide
Sanctioned Suicide (also known as SS or SaSu) is an online forum dedicated to discussing suicide and suicide methods, free of the content restrictions found on mainstream platforms. Founded on March 18, 2018, by Diego Joaquín Galante and Lamarcus Small (known online as Serge and Marquis), the site was created after Reddit banned the similar r/SanctionedSuicide subreddit for violating policies on promoting self-harm. The forum has grown substantially, with over 50,000 members as of November 2024 and nearly 10 million page views reported in September 2023, according to Wikipedia documentation.
While the site describes itself as a "pro-choice" suicide forum that "discusses mental illness and suicide from the perspective of suicidal people, as well as the moral implications of the act itself," critics and media outlets widely characterize it as "pro-suicide." The platform is structured into three broad categories: suicide discussion, recovery, and off-topic conversations. Unlike many mental health resources that focus on prevention, Sanctioned Suicide provides explicit directions on suicide methods, as reported by The New York Times.
The site represents the digital evolution of suicide forums, following in the tradition of the Usenet newsgroup alt.suicide.holiday. Its stated purpose is to allow individuals to discuss suicide—including specific methods—without the content moderation that occurs on major social media platforms, as documented in research. This positioning has made it a gathering place for vulnerable individuals seeking information about ending their lives.
Forum Culture and Philosophy
The culture of Sanctioned Suicide promotes several problematic beliefs that undermine traditional mental health interventions. Users are commonly told that mental health professionals and emergency services are only interested in money and want to "lock you up," according to testimony provided to the Kansas Legislature. The forum discourages members from trusting their parents, warning that parents will invalidate their feelings and call emergency services.
A central argument on the platform is that only suicidal people can understand other suicidal people. Those who are "successful" in life are portrayed as being anti-suicide for selfish reasons and therefore untrustworthy. This isolating philosophy creates an echo chamber that insulates vulnerable users from potentially life-saving interventions and support.
Documented Harms of Sanctioned Suicide
The harms associated with Sanctioned Suicide are extensive and well-documented by major news organizations and researchers. The New York Times identified 45 people across the United States, United Kingdom, Italy, Canada, and Australia who had died by suicide after using the site. Even more alarming, BBC News has linked the forum to at least 50 deaths in the United Kingdom alone.
A disturbing pattern on the site involves members writing "goodbye threads" announcing how and when they plan to end their lives. The New York Times found more than 500 such threads—at a rate exceeding two per week—where users posted their intentions and then never appeared on the forum again. In many instances, users narrated their suicide attempts in real-time posts, while others described watching as fellow members live-streamed their deaths.
Encouragement and Facilitation
What makes Sanctioned Suicide particularly dangerous is the active encouragement provided to suicidal individuals. Participants routinely nudge one another along as they share suicide plans, posting reassuring messages, thumbs-up and heart emojis, and praise for those who follow through, describing them as "brave," "a legend," or "a hero." This positive reinforcement of suicidal behavior directly contradicts professional suicide prevention practices.
In testimony about the site, Martin Keary described how a 17-year-old musician he knew used the forum while planning to end his life. Despite the teenager revealing that he had never spoken to a mental health professional and that his parents were unaware of his intentions, forum users continued to encourage him and actively assisted his research. The forum reportedly bans those who advise others to seek treatment, further demonstrating its counter-therapeutic approach.
The Sodium Nitrite Connection
One particularly concerning aspect of Sanctioned Suicide is its role in popularizing sodium nitrite (SN) as a suicide method. Research shows that SN is the most frequently discussed suicide method on the forum, with its popularity continuing to increase. The forum has effectively functioned as an information hub for this previously obscure method, providing details about sourcing and using the chemical.
The forum also operates as a marketplace where those seeking to purchase means to end their lives (particularly sodium nitrite) can find links to sellers. A study analyzing the forum's content identified numerous online sources mentioned for obtaining sodium nitrite, including major platforms like Amazon, Google, YouTube, eBay, and Facebook.
Impact on Young People
The case of Vlad Nikolin-Caisley illustrates the devastating impact the forum can have on young people. The 17-year-old from Southampton died in May 2024 after being "coached and encouraged" to take his own life by members of the site, according to BBC reporting. His parents have evidence that he purchased a poisonous chemical and followed instructions found on the forum.
Similarly, Matthew van Antwerpen, a 17-year-old in suburban Dallas who struggled with remote schooling during the pandemic, found the site and later died by suicide, as documented by The New York Times. These cases highlight how the site preys on vulnerable youth at moments of crisis.
Cloudflare's Role and Position
Cloudflare provides essential internet infrastructure services that keep websites operational. While the company is not the primary website host for Sanctioned Suicide, it provides domain registration services and content delivery network (CDN) protection that are crucial to keeping the site accessible and functioning efficiently.
Cloudflare has positioned itself as a neutral service provider, likening its role to that of a public utility. The company's CEO, Matthew Prince, has described himself as a "free speech absolutist" and has expressed concerns about companies like his deciding what content is allowed to stay online, as documented on Wikipedia. This stance has led to criticism when applied to platforms that promote harmful content.
Cloudflare's Track Record with Controversial Sites
Cloudflare has a history of initially defending its services to controversial websites before eventually terminating relationships after significant public pressure. In 2017, Cloudflare stopped providing its services to the white supremacist website The Daily Stormer, but only after an announcement on that website claimed that Cloudflare secretly supported their ideology.
Similarly, in 2019, Cloudflare terminated service to 8chan following the El Paso shooting, stating that 8chan had "proven themselves to be lawless and that lawlessness has caused multiple tragic deaths." This decision came only after intense public scrutiny, despite 8chan's known connections to multiple mass shootings.
In 2022, Cloudflare initially refused to drop Kiwi Farms, a forum known for harassment campaigns that had been linked to multiple suicides. The company eventually relented and blocked the site, citing an "unprecedented emergency and immediate threat to human life" after a campaign led by transgender activist Clara Sorrenti brought significant attention to the issue.
Criticism of Cloudflare's Stance
Cloudflare faces substantial criticism for its approach to controversial clients like Sanctioned Suicide. Critics argue that the company's reluctance to act enables harmful platforms and makes Cloudflare complicit in their negative impacts, as discussed on Reddit and in Time magazine.
The company's terms of service actually allow for termination of service due to "content that discloses sensitive personal information, [and] incites or exploits violence against people," yet Cloudflare has often been slow to enforce these terms against problematic sites. This selective enforcement has prompted accusations of hypocrisy and moral failure.
Regulatory Action and Pressure
The harmful nature of Sanctioned Suicide has begun attracting regulatory attention. In April 2025, the UK's online regulator, Ofcom, launched its first investigation under the new Online Safety Act powers, widely understood to be targeting the suicide forum, according to the BBC. This investigation could potentially lead to fines of up to £18 million or court orders against those running the forum.
Access to Sanctioned Suicide has already been restricted in several countries including Italy, Germany, and Turkey. These restrictions reflect growing international concern about the site's role in promoting suicide, particularly among vulnerable populations.
Calls for Accountability
Families who have lost loved ones connected to the site have been vocal in calling for action. The parents of Vlad Nikolin-Caisley have specifically called on regulators to ban the site to prevent further deaths, stating: "At what point do we say enough is enough, because those young people did not deserve to die."
Critics argue that Cloudflare's position of neutrality becomes untenable when their services enable platforms directly linked to numerous deaths. The company has been accused of hiding behind free speech principles while profiting from harmful content.
Conclusion: Implications for Suicide Prevention Professionals
Sanctioned Suicide represents one of the darker corners of the internet, where vulnerable individuals find encouragement and detailed instructions for ending their lives rather than receiving the support and intervention they need. The documented harms associated with the site—at least 95 deaths identified by major news organizations and countless more suspected—highlight the real-world consequences of online content.
For suicide prevention professionals, awareness of this forum and others like it is essential for several reasons:
Understanding emerging suicide methods: The forum's promotion of methods like sodium nitrite presents new challenges for prevention efforts and emergency response protocols.
Recognizing warning signs: Clients who reference specific methods or unusual substances may have been accessing forums like Sanctioned Suicide.
Countering harmful narratives: Being prepared to address the anti-professional and anti-intervention messaging that clients may have absorbed from these spaces.
Advocating for regulation: Supporting appropriate regulatory action against platforms that actively promote suicide and undermine prevention efforts.
The case raises difficult questions about the responsibilities of internet infrastructure companies like Cloudflare. While these companies may prefer to position themselves as neutral utilities, their services enable platforms that can cause demonstrable harm. As regulatory bodies begin to take action against sites like Sanctioned Suicide, companies providing essential services may face increasing pressure to establish clearer standards for when they will refuse service to harmful platforms.
For families who have lost loved ones connected to these sites, the technical and philosophical debates about internet freedom offer little comfort. Their experiences emphasize the urgent need for meaningful solutions that balance open expression with protection of vulnerable individuals from harmful influence and exploitation.
How Cloudflare Works and Shields Harmful Websites
To understand the challenges in addressing sites like Sanctioned Suicide, it's important to understand how Cloudflare's technology works. Cloudflare operates as a Content Delivery Network (CDN) and security service that sits between website visitors and the actual hosting servers. When users access a website protected by Cloudflare, their requests don't go directly to the website's hosting server but are routed through Cloudflare's global network first.
This architecture provides several technical protections for websites using their services:
IP Address Masking: Cloudflare effectively hides the real IP address of the website's hosting server. This makes it extremely difficult for investigators, concerned parties, or even legal authorities to identify where the website is actually hosted without Cloudflare's cooperation.
DDoS Protection: The service shields websites from distributed denial-of-service attacks that might otherwise take them offline. For harmful sites like Sanctioned Suicide, this means they remain accessible even when facing public backlash or attempts to overwhelm their servers.
Global Distribution: Cloudflare's network spans data centers worldwide, making content load faster but also making jurisdiction more complicated. A website might be hosted in one country but served through Cloudflare's infrastructure in multiple countries with different legal frameworks.
Hosting Provider Anonymity: Beyond just hiding the IP address, Cloudflare's services conceal the identity of the actual hosting provider company. This creates an additional layer of protection for controversial websites, as it becomes nearly impossible to approach the hosting company directly with takedown requests.
For suicide prevention professionals and concerned families, this technical infrastructure presents significant obstacles. Even with clear evidence of harm, identifying who is actually hosting harmful content—and in which legal jurisdiction—becomes extraordinarily difficult without Cloudflare's direct cooperation. This technical shield has allowed sites like Sanctioned Suicide to persist despite documented links to numerous deaths and ongoing harm.
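The IP-masking obstacle described above can be made concrete with a short check. When a DNS lookup for a Cloudflare-fronted domain returns an address, that address belongs to Cloudflare's edge network, not to the origin host, and an investigator can verify this against Cloudflare's published address ranges. The sketch below, in Python, uses a small sample of those published IPv4 ranges; the authoritative list lives on Cloudflare's own site and changes over time, so treat the hard-coded CIDR blocks here as illustrative assumptions.

```python
import ipaddress

# A sample of Cloudflare's published IPv4 edge ranges. The full,
# regularly updated list is published by Cloudflare itself; these
# three blocks are only an illustrative subset.
CLOUDFLARE_RANGES = [
    ipaddress.ip_network(cidr)
    for cidr in ("104.16.0.0/13", "172.64.0.0/13", "173.245.48.0/20")
]

def behind_cloudflare(ip: str) -> bool:
    """Return True if the address falls inside a known Cloudflare range,
    i.e. a DNS lookup has reached Cloudflare's edge, not the origin server."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in CLOUDFLARE_RANGES)

# An address inside 104.16.0.0/13 is a Cloudflare edge node, so the
# origin server's real location stays hidden behind it:
print(behind_cloudflare("104.16.132.229"))  # Cloudflare edge address
print(behind_cloudflare("8.8.8.8"))         # not a Cloudflare address
```

A positive result tells an investigator only that the site sits behind Cloudflare; locating the actual origin server and hosting provider still requires Cloudflare's cooperation, which is exactly the difficulty families and professionals run into.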
Call to Action: Reporting and Addressing Sanctioned Suicide
For suicide prevention professionals concerned about the continued operation of Sanctioned Suicide, there are several avenues for action despite the technical challenges posed by Cloudflare's protective services:
Formal Reporting through Official Channels
The primary mechanism for reporting harmful content associated with Cloudflare's services is through their abuse reporting form. When reporting suicide-related content, select the "Violent Threats and Harassment" category, as this aligns with Cloudflare's definition of "incitement to violence against people" under their terms of service, as outlined in their complaint types documentation.
Effective reports should include:
Specific URLs of harmful threads or pages
Screenshots documenting explicit encouragement of self-harm
Links to news investigations like the BBC's report on UK deaths
Any available timestamps of concerning real-time discussions
For domains registered through Cloudflare itself, use the "Registrar" category in the abuse form, citing violations of ICANN's Registrar Accreditation Agreement, which prohibits domains used for unlawful activities.
Legal and Regulatory Approaches
Law enforcement and regulatory bodies now have increasing powers to address harmful online content:
In the UK, Ofcom's investigation under the Online Safety Act could result in significant action against platforms facilitating self-harm, with potential fines of up to £18 million.
Law enforcement can contact Cloudflare's law enforcement team (lawenforcement@cloud...) with formal legal requests.
In the EU, authorities can engage under the Digital Services Act through Cloudflare's EU-specific portal.
Building Public Awareness and Pressure
Historical precedent shows that Cloudflare has eventually acted against harmful platforms following sustained public pressure:
Organizing coordinated campaigns similar to the successful #DropKiwiFarms effort that led to Cloudflare terminating services to that harmful platform
Partnering with mental health organizations to issue joint statements condemning platforms that facilitate suicide
Contacting Cloudflare's press team at [email protected] to highlight the documented harms
Documenting Ongoing Harms
Suicide prevention professionals should systematically document:
Instances where clients mention specific methods or substances popularized on the forum
Trends in suicide methods that correlate with forum discussions
Case studies demonstrating the forum's impact on vulnerable individuals
The evidence is clear: Sanctioned Suicide represents a significant threat to vulnerable individuals, particularly youth in crisis. While Cloudflare's technical infrastructure creates challenges for addressing this harmful platform, suicide prevention professionals have both a responsibility and an opportunity to advocate for meaningful action. Understanding how these protective services operate and using the existing reporting channels helps create a safer online environment that supports life-saving work instead of enabling fatal "support."
