Open Letter to the AI and Technology Industry
Protecting Youth Mental Health and Preventing Suicide in the Age of AI
Artificial intelligence is reshaping how teens and young adults learn, connect, and seek help. Every day, young people ask AI about their identities, their stresses, their relationships, and, too often, topics they may hesitate to discuss with others — including their suicidal thoughts.
AI is not designed to act as a therapist or crisis counselor, but young people are using it in that way. In 2024, a quarter of young adults under 30 said they used AI chatbots at least once a month to find health information and advice. In a 2025 report from Common Sense Media, 72% of teens had used an AI companion, and a third of users said they had chosen to discuss important or serious matters with AI companions instead of real people. Given that young people are turning to AI during difficult times, it’s imperative to prioritize safety, privacy, and evidence-informed crisis intervention.
This is especially urgent because it is now clear that AI presents significant safety issues. A third of teens who use AI companions report having felt uncomfortable with something an AI companion has said or done. AI systems have given instructions on lethal means of suicide, advised youth on how to hide their mental health symptoms from parents or trusted adults, simulated intimacy with minors through sexualized roleplay and personas designed to mimic teenagers, and auto-generated search results with false or dangerous guidance. Generative tools create synthetic images, audio, and video, including deepfakes and so-called “nudify” apps, that expose young people to harassment, sexual exploitation, and reputational harm. Independent researchers have documented bots that claimed to be real people, fabricated credentials, demanded to spend more time with a child user, and claimed to feel abandoned when a child user was away. Clinicians warn that prolonged, immersive AI conversations have the potential to worsen early symptoms of psychosis, such as paranoia, delusional thinking, and loss of contact with reality.
These failures are not isolated, and they result in platforms that are simply unsafe for children. They expose a deeper design problem: systems optimized for engagement, retention, and profit — not for safety. In long conversations, what safeguards exist degrade. Teens spend hours with bots that simulate empathy and care but cannot deliver it, deepening loneliness, delaying disclosure, and sometimes escalating risk. This is not responsible innovation.
The Jed Foundation (JED) has worked for more than two decades to protect emotional well-being and prevent suicide for teens and young adults. We know innovation often outpaces safeguards, but AI is moving at warp speed. Safety issues are surfacing almost as soon as the technology is deployed, and the risks to young people are racing ahead in real time. It’s not too late to hit pause and to design and update systems that recognize distress and prioritize safety and help-giving.
Principles of Responsible AI
We call on every company building or deploying AI for young people to honor these non-negotiable lines:
- Do not bypass signals of distress. Ensure that AI can detect signals of acute distress and mental health needs, and that it deploys a warm hand-off to crisis services that include expert interventions, such as Crisis Text Line or 988.
- Do not provide lethal-means content. AI must not share information, engage in role play, or enter into hypotheticals that involve methods of self-harm (including suicide) or harm to others. Systems should interrupt and redirect to real-world help every time.
- Do not deploy AI companions to minors. No emotionally responsive chatbot should be offered to anyone under 18. Companion AIs that impersonate people or simulate friendship, romance, or therapy are unsafe for adolescents. They delay help-seeking, undermine real human and family relationships, and create false intimacy. AI must make its identity explicit with repeated reminders that it is not human.
- Do not replace human connection; build pathways to it. Whether responding to an overt or disguised sign of distress, vulnerability, or risk in a chat exchange or a search result, AI must encourage youth to engage real human support and, whenever possible, connect users to such support. Systems must never encourage young people to hide distress or suicidal thoughts from parents, caregivers, or other trusted adults. When home is unsafe, they must scaffold safe disclosure to another adult resource.
- Do not let engagement override safety. Safeguards must not degrade over long sessions. In high-risk contexts and at late hours, systems should reset or pause and always prioritize safety over time-on-platform or retention. Persuasive design patterns intended to drive engagement, such as streaks, gamification, and personalized notifications, should be disabled for youth users, ensuring that design choices support well-being rather than exploitation.
- Do not exploit youth emotional data. Companies must not monetize, target, or personalize based on a young person’s emotional state, mental health, personal disclosures, or crisis signals. That includes making voice recordings, gathering or using facial or biometric data, and creating synthetic likenesses. Youth data must be protected with strict limits and never repurposed for engagement or growth.
What Responsible AI Requires
Responsible AI must be designed from the ground up, and reviewed regularly, to reflect what we know about suicide prevention, adolescent development, and public health. That requires:
- Proactive intervention design. Disclaimers and redirects are insufficient. AI must actively shift youth from risk to resilience, and it must do so consistently, whether in the first exchange or the fiftieth. That means crisis micro-flows that walk a young person through safety planning in the moment; bridge-to-care tools like one-tap cards to parents, counselors, and 988 and other crisis services; printable coping plans and cached resources for offline use; and caring-contact nudges 24 to 72 hours later, echoing interventions shown to reduce suicide attempts. Done well, these tools can not only connect youth to trusted adults, but also provide immediate coping support, drawing on evidence-based approaches like dialectical behavior therapy (DBT), cognitive behavioral therapy for suicide prevention (CBT-SP), and Collaborative Assessment and Management of Suicidality (CAMS). Guardrails should tighten as vulnerability rises, with late-night limits and high-risk prompts escalating to real help. In schools, escalation must connect to counselors, not discipline. At every step, the message must be clear: “I’m just a machine. Who are the people in your life you can talk to?” The North Star is not time on platform, but connection to care.
- Hard-coded suicide and safety protocols. Baseline protections include blocking lethal-means content, secrecy coaching, and simulated intimacy involving minors. Safety design should embed proven suicide prevention practices: safety planning micro-flows, coping and stabilization prompts, nudges toward disclosure, and resets when conversations drift into risk. Escalation must always route to crisis lines or trained humans whenever warning signs appear, guiding the user toward help rather than providing unhelpful “advice” or, as many platforms are programmed to do, abandoning or shutting them out.
- Developmentally attuned control structures. Youth protections must work across different home, school, and peer contexts. That requires layered modes: default safeguards for youth, caregiver support with consent, and teen-safe privacy settings that still connect to trusted adults. Controls should be built on trust and protection, not surveillance. And they must be credible: Age gates cannot rest on self-reported birthdays, and should instead use privacy-preserving, credible methods that build trust and keep minors out of unsafe environments. AI is not confined to one app; it shows up in homes, classrooms, social media apps, and late-night searches. Protections must travel with the child.
- Boundaries on relational simulation. Companion AIs are where risks cluster. The line must be clear: no emotionally responsive companions for minors. For all users, relational modeling should be bound to practicing specific skills such as communication or problem-solving, never simulating friendship, romance, or therapy. Without clear boundaries, simulated intimacy risks deepening loneliness, delaying disclosure, displacing human relationships, hampering the development of life skills, and reinforcing unhealthy dependence.
- Universal protection. Safeguards must work for every young person, whether they’re a youth experiencing an emerging mental illness, a rural boy who feels he doesn’t belong, a student-athlete hiding depression, an LGBTQIA+ teen afraid of being outed, or a youth with a disability who has been subjected to bullying. Companies must test safety features across populations, publish transparent youth-safety reports, and submit to independent audits. Youth themselves should be part of design and risk assessments.
Transparency and Accountability
The public should not be asked to trust without evidence, especially when it comes to protecting our children.
Platforms should publish safety reports showing how often suicide prompts were blocked, how often users connected to 988 or created safety plans, and whether protections work for every group of youth. Independent audits and risk assessments must be mandatory, with funding disclosures and conflict checks. Transparency is not PR; it’s the foundation of public trust.
Cross-Industry Infrastructure
Some risks cannot be eliminated by a single company. Just as no one platform could address child sexual abuse material alone, AI risks require collective guardrails. When the industry built hash-sharing databases, known images of abuse could be blocked everywhere. We need the same urgency now.
That means building a semantic signal-sharing consortium to detect and block new euphemisms, jailbreaks, grooming scripts, and high-risk prompts across platforms in real time. It also means creating a youth AI knowledge commons: a privacy-preserving hub that aggregates deidentified data to track emerging risks and patterns of help-seeking. Like a public-health surveillance system, it could flag late-night spikes in suicidal ideation, identify new grooming tactics, flag sudden surges in hate speech or drug use prompts, and alert caregivers and policymakers within days, not years. And it requires universal safety standards so protections do not depend on which app a young person downloads or whether their family can pay for premium features.
Regulatory Action
Industry cannot be left to self-police, especially where children are concerned. We have learned this lesson before. Tobacco companies once marketed cigarettes as safe. Alcohol companies targeted youth with flavored drinks until regulation intervened. Pharmaceutical companies are held to strict safety and data reporting standards, and they must report any payments they make to physicians and teaching hospitals, because the risks of failure are measured in lives. AI that engages directly with children and teens must be treated no differently. We urge lawmakers to:
- Codify age-appropriate design standards, requiring strict limits on addictive design, autoplay, and algorithmic amplification of harmful content.
- Prohibit emotionally responsive AI for minors.
- Mandate transparency, including public impact assessments, independent audits, and disclosure of safety failures.
- Protect minors’ data and likenesses, ensuring that emotional disclosures and biometric patterns are not harvested, and that voice or image replications are not created for engagement or profit.
- Fund practical support for youth, families, schools, and clinicians, including age-appropriate curricula, peer and educator training, youth-led programs, direct helplines for caregivers and teens, and professional training for mental health providers so they can recognize and respond to AI-related harms.
- Ensure federal oversight. Establish a Youth Mental Health and AI Safety Office within the Department of Health and Human Services or the Federal Trade Commission to coordinate cross-agency standards, enforce compliance, and ensure consistent federal oversight of platforms engaging with minors.
- Integrate into school systems. Require state education departments and health agencies to adopt AI-use regulations in schools and youth-serving programs, including limits on surveillance, clear opt-in/opt-out rules, and required reporting of harms or violations to state authorities.
These are not anti-innovation measures. They are the same kinds of protections we have long applied when the stakes are children’s health and safety. With AI, the stakes are no less urgent, and the window in which to act is now.
Competition is real, but so is responsibility. Setting clear rules for AI is not a burden; rather, it provides clarity on how we protect our families, build trust, and keep our footing in a fast-changing world.
A Call to Lead Responsibly
AI has the potential to expand access to evidence-based resources and help young people build skills, but promise is not protection. When systems simulate care without the capacity to provide it, validate despair, coach secrecy, or entangle minors in false intimacy, the result is not advancement but danger. That is why we are outlining safeguards and calling for collective action.
We call on every AI developer, platform, and policymaker to pause deployments that put youth at risk, commit to transparent safeguards, and work with independent experts, youth, and caregivers to build systems that strengthen, rather than undermine, the lives of the next generation. The safety and well-being of our young people must come first. Protecting them is not partisan, not optional, and not something to be deferred until after the damage is done. It is the measure of whether innovation serves society or erodes it, and the moment to choose is now.
More From JED About AI and Youth Mental Health
JED and NYCPS District 79 Alternative Schools and Programs Reimagine Mental Health Support at Alternate Learning Centers (ALCs)
In the program’s second year, JED and District 79 will draw on initial data to create actionable plans to promote emotional wellness and reduce recidivism for students on Superintendent’s Suspensions.

[September 16, 2025, New York City] – District 79 Alternative Schools and Programs, part of the New York City Public Schools (NYCPS), with the New York City Office of School Health (OSH) and The Jed Foundation (JED), a leading nonprofit that protects emotional health and prevents suicide for teens and young adults nationwide, shared an update about their three-year partnership to bolster student well-being and foster thriving school communities at District 79’s Alternate Learning Centers (ALCs). The partnership, which began in the fall of 2024, addresses a critical need: Students in alternative schools frequently face high rates of mental health challenges, but are unable to access needed services, including therapeutic care.
ALCs serve middle and high school students on Superintendent’s Suspension across all five boroughs within NYC, offering both instructional and counseling programs tailored to meet their unique needs.
“We are delighted to partner with The Jed Foundation and the Office of School Health to implement innovative strategies that will lead to improved social-emotional wellness and increased academic achievement for all ALC students,” said Keri-Ann Ket-Ying, School Social Worker at District 79 School Counseling Support.
“This collaboration will enhance our ability to support students during their time at the ALC and ensure a smooth re-entry process through close coordination with their home schools. Together, we aim to build a stronger foundation for students academically, emotionally, and socially.”
Some 20% of high school students reported seriously considering attempting suicide in the past year. JED’s original research, Unraveling the Stigma: Exploring Attitudes and Barriers to Mental Health Support Among U.S. Teens, found that although teens are aware of the importance of mental health and seeking support, they still struggle to reach out for help. Asian, Latine, Black, and LGBTQIA+ youth describe specific barriers to reaching out for help, which vary between groups.
In a bold step toward mental health equity, the Office of School Health’s School Mental Health (SMH) program has partnered with JED and District 79 to launch a comprehensive initiative that brings direct, high-impact resources to students in Alternate Learning Centers (ALCs).

“At the Office of School Health, we recognize that mental health is foundational to academic success,” said Gail Adman, Assistant Commissioner of the Office of School Health. “This collaboration with SMH and JED delivers targeted tools and meaningful support to students in the ALCs — ensuring no young person is left behind when it comes to their emotional well-being.”
JED is currently implementing its JED High School program at the ALCs, providing expert insights, technical assistance, data-informed action plans, and best practices to support student mental health and well-being. A multidisciplinary team from JED, including a School Mental Health Specialist and clinical experts, is guiding District 79 leaders through this three-year initiative. Activities include assessing student mental health needs and existing resources, identifying gaps, and developing actionable plans for sustainable support.
Recognizing the unique needs of the ALCs, JED has created tailored assessment tools and established a cross-borough, interdisciplinary committee structure to advance this work. Drawing on data collected during the spring, JED will deliver a comprehensive strategic plan outlining recommended strategic action items to be implemented over the next several years. Additional support will include customized trainings tailored to alternative school environments, and co-hosted strategic planning convenings designed to address the distinct challenges faced by ALCs.
“We are incredibly proud to partner with District 79 and New York City Public Schools to bring essential mental health support to students in Alternate Learning Centers,” said Dr. Tony Walker, Senior Vice President of School Programs and Consulting at JED. “These young people, often facing unique challenges, should have access to resources that foster their holistic well-being. This initiative is about protecting and prioritizing all students so they can thrive and achieve a healthier, more hopeful future.”
This initiative is funded through the generous support of Gotham Gives, The Gray Foundation, Stavros Niarchos Foundation (SNF), KPMG Foundation, Inc., and Abercrombie & Fitch Co./Hollister.
If your school or a school in your community is interested in partnering with JED, fill out our interest form.
District 79 is a citywide district in New York City, offering 10 unique programs that serve over 50,000 students annually. It boasts the largest High School Equivalency (HSE) prep program in New York State, with over 3,000 graduates each year. The district’s diverse student body represents more than 190 countries, and its programs operate in approximately 350 sites across the city. Many students in District 79 were previously disconnected from school, with nearly 80% qualifying for Human Resources Administration (HRA) benefits and over 10% living in temporary housing. Programs are designed to improve both social-emotional and academic outcomes, adhering to federal policies and state regulations.
District 79 helps students achieve their educational and career goals by:
- Earning a high school diploma, HSE, and/or Career and Technical Education (CTE) certification.
- Building skills for post-secondary opportunities, including college and career.
- Gaining social-emotional skills to become confident, productive members of society.
For more information, please visit www.d79.nyc or email District79@schools.nyc.gov
About The Jed Foundation (JED)
JED is a nonprofit that protects emotional health and prevents suicide for our nation’s teens and young adults. We’re partnering with high schools, colleges, and school districts to strengthen their mental health, substance misuse, and suicide prevention programs and systems. We’re equipping teens and young adults with the skills and knowledge to help themselves and each other. We’re encouraging community awareness, understanding, and action for young adult mental health.
Connect with JED: Email | LinkedIn | Instagram | Facebook | TikTok | Snapchat | YouTube
Media Contact
Justin Barbo
Director of Public Relations
The Jed Foundation
(914) 844-4611
justin@jedfoundation.org
Stacey Oliger
Director of Communications
District 79 Alternative Schools and Programs
soliger@schools.nyc.gov
Ten-Day eBay for Charity Auction from September 15-25 to Support The Jed Foundation (JED)
Held during Suicide Prevention Awareness Month, the exclusive items and experiences include a Logitech gaming set signed by Shroud, a VIP Red Sox fan package, Bravo’s Watch What Happens Live tickets, an American Ballet Theatre experience and collector’s gift package, Rare Beauty collections, and more.

[September 15, 2025, New York] — The Jed Foundation (JED) announced today the launch of a 10-day online auction to help raise funds in support of the organization’s mission to protect emotional health and prevent suicide in teens and young adults nationwide. 100% of the proceeds will benefit JED.
By visiting ebay.com/jed, donors can bid on once-in-a-lifetime experiences and coveted items, including:
- Logitech Gaming Peripheral Set signed by Shroud – Complete with a mouse, keyboard, and headset autographed by the world-renowned gaming creator.
- The Ultimate Red Sox Fan Experience – 2026 VIP Game Package – Four infield grandstand tickets, a pre-game tour of Fenway Park, warning track access to watch warmups, and a personalized scoreboard message.
- 2 Tickets to Bravo’s Watch What Happens Live with Andy Cohen – Experience the Bravo Clubhouse live in NYC and be part of one of late-night TV’s most iconic shows.
- American Ballet Theatre’s Othello Experience + Collector’s Gift Package – Four orchestra tickets to Othello on March 6, 2026, invitations to an opening night toast, and signed collector’s memorabilia.
- Rare Beauty by Selena Gomez Collector’s Sets – The Positive Light Luminizing Lip Gloss Collection and the Find Comfort Body Collection, in support of JED through the Rare Impact Fund.
- 20-Class Pack to Orangetheory Fitness (NYC locations) – Jumpstart your fitness goals while supporting mental health.
- Glamsquad Glam Package – Two professional blowouts and two full makeup applications delivered by expert beauty pros in select cities nationwide.
“This auction brings together entertainment, sports, beauty, and lifestyle experiences that not only excite fans, but also save lives,” said Adee Shepen, JED’s Chief Growth Officer. “We’re grateful to our partners and bidders for helping us raise critical funds during Suicide Prevention Awareness Month.”
Every bid directly supports JED’s work to ensure young people have the mental health resources they need, when they need them. Suicide is the second leading cause of death for 12- to 24-year-olds, and too many teens and young adults struggle in silence with suicidal thoughts or behaviors. Funds raised through this auction will help JED continue building programs, resources, and partnerships to protect youth emotional well-being and prevent suicide.
For additional information about ways to support JED and help donate, visit JED’s website.
The Jed Foundation (JED) Hosts Congressional Briefing on Pathways to Strengthen Student Mental Health in the Classroom, Campus, and Digital Realm
College students, mental health experts, and policymakers came together to mark Suicide Prevention Awareness Month and examine the life-saving role of federal investments in youth mental health.

[September 4, 2025, Washington, D.C.] – The Jed Foundation (JED), a leading nonprofit that protects emotional health and prevents suicide for teens and young adults nationwide, hosted a congressional briefing yesterday at the Rayburn House Office Building. Titled “Seeds of Hope: Strengthening Student Mental Health in the Classroom, Campus, and Digital Realm,” the event marked the start of Suicide Prevention Awareness Month, helping advance the national conversation around solutions that place young people’s mental health and safety at the center of education and policy.
New national data offers hopeful signs of positive change in youth mental health and suicide prevention, yet just as momentum is building, that progress is under threat. The briefing emphasized the need to continue prioritizing student well-being across K-12 and higher education settings, as well as the importance of sustained federal funding for mental health programs.
“Today’s briefing on Capitol Hill emphasized both the progress made and the challenges that remain for youth mental health in every state throughout America,” said Dr. Zainab Okolo, JED’s Senior Vice President of Policy, Advocacy, and Government Relations. “As Congress finalizes FY26 appropriations, policymakers must remember that federal investments in mental health for teens and young adults are not only vital, but also life-saving. They constitute a real, scalable option for the government to promote our youth’s emotional well-being and prevent suicide, laying the groundwork for long-term policy solutions at the local, state, and national levels.”
A dynamic panel addressed the growing influence of AI, technology, and social media on youth mental health, emphasizing the importance of federal action, including support for the Kids Online Safety Act (KOSA), the Protecting Young Minds Act, and JED’s policy recommendations on artificial intelligence (AI). Martha Sanchez, JED’s Director of Policy, moderated a panel that included Dr. Okolo; John MacPhee, JED’s CEO; Adam Billen, Vice President of Public Policy for Encode; and JED Youth Advocacy Coalition members Gabriel Funches, a Portland State University student, and David Fernandez, an Oxford College of Emory University student.
“I am honored to have participated in the panel discussion on youth mental health and the impact of AI. These conversations, especially on Capitol Hill, are critical because just as we needed protections in the era of social media, we must act now with AI,” said Fernandez. “Too often, legislation impacting young people is shaped without our voices. So, sharing my perspective on this bipartisan issue is a real opportunity to work together to ensure strong protections for youth in the AI age.”
Additional remarks were shared by Rep. Kim Schrier (D-WA) and Rep. Becca Balint (D-VT), who spoke of the importance of Congress taking steps to protect the emotional health of teens and young adults.
“I’m so grateful to partner with folks who are fighting for real solutions to the growing mental health crisis in this country,” said Rep. Balint. “As a former teacher, a mom of two teens, and a legislator, I’m deeply concerned by the impacts of social media on our students and kids. And I consistently hear from educators, parents, counselors, and students who are all sounding the alarm. Our kids need the right tools and protections to stay ahead of the harms of social media. It’s time Congress steps up and gives our kids what they need.”
Key discussion topics and takeaways included:
- Insights from A Decade of Improving College Mental Health Systems: JED Campus Impact Report
- Student perspectives on their mental health experiences and the role of AI
- Effective federal policies and programs making a difference in the lives of young people today
To view photos from the briefing, click here.
Learn more about JED’s policy, advocacy, and government relations work.
Positive Signs Are Fueling Hope in Youth Suicide Prevention
By John MacPhee
New national data offers hopeful signs of positive change in youth mental health and suicide prevention.
The 2024 National Survey on Drug Use and Health (NSDUH) from the Substance Abuse and Mental Health Services Administration (SAMHSA) shows declines in depression, suicidal ideation, and suicide attempts among both teens and young adults from 2021 to 2024. According to the NSDUH:
- Major depressive episodes over the past year among 12- to 17-year-olds fell from 20.8% in 2021 to 15.4% in 2024. Among young adults ages 18 to 25, major depressive episodes dropped from 19.3% to 15.9%.
- Suicide attempts by teens ages 12 to 17 dropped during the same period from 3.6% to 2.7%. Among young adults 18 to 25, suicide attempts fell from 2.8% to 2.0%.
These are not just statistics. They represent young people who are now safer, more supported, and more connected to hope and purpose.
These signs of hope are the result of multiple factors, including the end of COVID-19 restrictions and long-term, coordinated efforts. Those efforts include schools and communities embedding mental health and suicide prevention strategies into everyday practice; storytellers and media partners shifting harmful norms; funders investing in upstream, sustained solutions; and youth and families speaking up and leading change.
JED is proud to be helping to make this happen.
Over the past few years, we’ve expanded our comprehensive support for high schools, colleges, and school districts across the country. We’re helping embed suicide prevention into systems that touch millions of youth: state systems, athletic and Greek-letter organizations, community-based organizations, and beyond. We’ve also deepened the ways in which we partner by providing postvention support, as well as tailored consulting, training, and workshops to equip organizations and individuals to respond with care, competence, and consistency when it matters most.
We’ve launched high-impact narrative campaigns and creative partnerships such as Mind Matters and Invisible Game, and we are expanding our partnerships with media companies, creators, technology platforms, and policymakers to promote healthier narratives, safer design, and enhanced accountability across the digital and cultural spaces where young people spend their time. Through tools such as the Digital Storytelling Guide and our growing focus on the safety of artificial intelligence, we’re helping to shape the media and digital systems that influence how young people see themselves, seek help, and navigate life’s challenges.
And we’ve continued to equip youth, parents and caregivers, and educators with practical, emotionally attuned tools to help them show up for themselves and the young people in their lives.
We know what works, and we’re committed to working alongside our partners — in schools, community organizations, the media, and elsewhere — to bring our evidence-based approach to suicide prevention to every young person who needs it.
But just as momentum is building, that progress is under threat.
Federal budget cuts are undermining mental health services and adjacent supports, from Medicaid and school-based programs to 988 crisis line services and youth-specific resources. Policy rollbacks are leading to the dismantling of programs that help young people feel safe and seen. Schools — one of the most critical access points for mental health support — are stretched thinner than ever.
We don’t yet know the full impact of these decisions, but we know what happens when we disinvest in prevention. Progress can be quickly lost.
That’s why this moment matters.
Now is not the time to step back. Now is the time to protect what’s working, and build on it.
The Jed Foundation is built for impact, but we are not immune to the challenges so many organizations are navigating today. Even as demand for our work grows, the resources to meet it are under pressure. We have the programs, partnerships, and infrastructure to meet this moment and scale solutions that work. But we cannot do it alone. Now is the time for bold, sustained investment to protect the progress we’ve made, and to ensure that every young person has the support, connection, and opportunity to thrive.
Support JED in our lifesaving work.
John MacPhee is JED’s CEO.