
National & Local Media Coverage of The Sabrina Carpenter Fund’s Support for JED

Find news coverage of Sabrina Carpenter’s support of JED through The Sabrina Carpenter Fund, created alongside her PLUS1 partnership.

  • Forbes: The Sabrina Carpenter Fund Reaches $1 Million Milestone In Record Time
  • Rolling Stone: How Sabrina Carpenter Raised $1 Million for Charity
  • Billboard: Sabrina Carpenter Raises $1 Million for Mental Health Initiatives & LGBTQ+ Rights in Less Than a Year
  • Variety: Sabrina Carpenter Has Raised $1 Million for Charities With ‘Short n’ Sweet’ Tour
  • Yahoo! News: How Sabrina Carpenter Raised $1 Million for Charity
  • MSN Entertainment: Sabrina Carpenter Has Raised $1 Million for Charities With ‘Short n’ Sweet’ Tour
  • Philanthropy News Digest: Sabrina Carpenter Has Raised $1 Million for Charities With ‘Short n’ Sweet’ Tour
  • Buzzfeed Media: Sabrina Carpenter has collected $1 million for charity with tour
  • Pop Rant: Sabrina Carpenter donates proceeds from ‘Short n’ Sweet’ tour to Transgender Law Center, fans praise her powerful act of support
  • Soap Central: “She’s always on the right side of history” – Fans react to Sabrina Carpenter selecting Transgender Law Center as a beneficiary for her tour

JED Applauds Senate Rejection of Harmful AI Moratorium Proposal, Urges Congress to Continue Support for Youth Mental Health

New York, NY – The Jed Foundation (JED) applauds the near-unanimous Senate vote early this morning to adopt Senator Marsha Blackburn’s (R-TN) amendment striking a dangerous provision in the “One Big Beautiful Bill” that would have imposed a federal moratorium on state regulation of artificial intelligence (AI). The decisive action rightly preserves states’ ability to act swiftly to protect youth from AI systems that are developmentally inappropriate, exploitative, or manipulative.

“At a time when AI is increasingly shaping the environments where children and young adults learn, socialize, and seek support, it is essential that governments at every level retain the power to act to curb the potential dangers that AI poses for youth,” said Dr. Zainab Okolo, Senior Vice President of Policy, Advocacy, and Government Relations at JED. “A federal moratorium would have undermined critical state efforts to respond to emerging harms and implement urgently needed safeguards and accountability measures.”

This 99-1 bipartisan action sends a clear message: Protecting the mental health and well-being of children must be our top priority. 

JED strongly supports federal action to protect youth online, including the Kids Online Safety Act (KOSA), which would establish critical safety standards for online platforms and the AI systems embedded within them, and the development of a federal data privacy framework that addresses algorithmic profiling, opaque recommendation systems, and manipulative AI design. But until such protections are fully enacted and enforced, states must retain the authority to create applicable guardrails for AI that protect youth from harm, promote emotional well-being, and ensure accountability wherever AI is deployed. (Read our latest AI policy recommendations here.)

As Congress considers final passage of the “One Big Beautiful Bill,” we urge lawmakers to keep the safety and best interests of children at the center – and to reject policies that would weaken access to Medicaid and the mental health services young people rely on.

JED stands ready to work with both state and federal leaders to ensure our policies are prioritizing youth mental health and providing the protections and supports that young people need and deserve. We continue to advocate for specific protective policies to meet this moment. 


###

About The Jed Foundation (JED)
JED is a nonprofit that protects emotional health and prevents suicide for our nation’s teens and young adults. We’re partnering with high schools, colleges, and school districts to strengthen their mental health, substance misuse, and suicide prevention programs and systems. We’re equipping teens and young adults with the skills and knowledge to help themselves and each other. We’re encouraging community awareness, understanding, and action for young adult mental health. 

Connect with JED: Email | LinkedIn | Instagram | Facebook | TikTok | Snapchat | YouTube 

Media Contact
Justin Barbo
Director of Public Relations, The Jed Foundation
Justin@jedfoundation.org

Insights From the JED and AASA District Mental Health Initiative Virtual Summit 2025

Opening slide from the District Mental Health Initiative Summit 2025, a May 6 virtual summit featuring keynote presentations, interactive workshops, and collaborative breakout sessions designed to strengthen districtwide mental health approaches.

“Educators can take the first step in suicide prevention.” 

That was Dr. Shashank Joshi’s message to a group of education leaders from across the country at a virtual summit for participants in the JED and AASA District Mental Health Initiative. A professor of psychiatry, pediatrics, and education at Stanford University and an advisory board member for The Jed Foundation (JED), Dr. Joshi went on to explain how educators can intervene if a student is in distress. 

The May 6 virtual summit featured keynote presentations, interactive workshops, and collaborative breakout sessions designed to strengthen districtwide mental health approaches. Participants engaged on a range of topics, including how to leverage local partnerships and involve students in decision-making. The summit also provided district leaders with meaningful opportunities for networking and resource sharing.

Proactive Approaches to Student Mental Health: Insights From Dr. Shashank Joshi

Dr. Joshi’s presentation, Strengthening Early Intervention and Prevention: Proactive Approaches to Student Mental Health, offered district leaders specific strategies for preventing suicide through thoughtful policies and protocols. He framed the conversation around a compelling case study of a 15-year-old student who displayed warning signs at both home and school. 

He explained that crises can be triggered by stressful events, often with an underlying mood or other psychiatric disorder and exacerbated by intoxication or substance use. These crises typically last about 30 minutes. During that crucial window, educators can intervene — interrupting a student who may think a temporary problem requires a permanent solution.

Dr. Joshi instructed educators to ask students directly about their mood, stress levels, and coping strategies. If students aren’t ready to talk, educators can take a student’s emotional temperature by asking how distressed they are feeling on a scale of 1 to 10. 

When indicated, educators can ask questions about thoughts of suicide — a conversation that, contrary to common misconceptions, does not increase risk and may open the door to lifesaving support. In particular, Dr. Joshi pointed to two protocols that educators can follow to assess suicide risk. Both protocols require a brief training and involve asking students only a few questions.

Dr. Joshi also introduced the Stanley-Brown Safety Plan, which students can use to list coping strategies and social supports to turn to when they are experiencing thoughts of self-harm or suicide. The plan is available in paper form, as well as through iOS and Android apps.

Building Protective Factors at the District Level

Dr. Joshi also highlighted actionable strategies for fostering protective factors across school communities.

  • Family connectedness: Approach family involvement with respect for each family’s values. Practice humility by acknowledging what you don’t know. Remember that each child’s family has a unique perspective on help-seeking.
  • School climate: Create supportive school environments in which students have prosocial peer connections, experience a sense of belonging, and can identify trusted adults.
  • Social support networks: Develop opportunities for students to engage in supportive social environments, including sports teams and clubs, that foster meaningful connections.
  • Physical and spiritual well-being: Promote quality sleep (through policies such as later school start times), physical activity, and respect for students’ religious or spiritual beliefs that may discourage self-harm.

The virtual summit exemplified JED and AASA’s commitment to equipping education leaders with evidence-based strategies that protect student emotional health and prevent suicide. By bringing together experts like Dr. Joshi and dedicated district leaders, we continue to build a nationwide network of schools prepared to support student mental health with compassion and effectiveness.

Learn how your district can join the JED and AASA District Mental Health Initiative.

Tech Companies and Policymakers Must Safeguard Youth Mental Health in AI Technologies

Artificial intelligence (AI) is rapidly reshaping how teens and young adults learn, connect, express themselves, manage stress, launch careers, and seek support. From personalized learning tools and algorithm-driven content recommendations to AI companions and mental health chatbots, AI is a present and accelerating force in their lives. 

But AI systems are not neutral. They are introducing new, large-scale risks to youth mental health — often without transparency, safeguards, or accountability. AI is already affecting youth development and how young people experience identity, relationships, community, stress, and help-seeking. 

At The Jed Foundation (JED), we work to promote emotional well-being and reduce suicide risk for teens and young adults. We believe AI must be developed and deployed in ways that enhance youth mental health, not undermine it. Young people must not be left to navigate these systems without the appropriate tools, support, and developmental readiness. We are committed to ensuring that AI does not deepen isolation, distort reality, or cause harm, but instead serves as a tool to strengthen connection, care, and resilience. 

Our nation has long recognized that children require special protections as they grow and mature. Over the past century, we have enacted robust safeguards governing child labor, tobacco and alcohol marketing and sales, advertising and media, and how companies collect and use young people’s data. These protections reflect a simple truth: Children are not miniature adults.

Adolescence is a critical period of brain development — second only to infancy — shaping how young people regulate emotions, form identity, and assess risk. Emerging research from the American Psychological Association (APA) underscores that this stage of life brings heightened sensitivity to social feedback and emotionally engaging environments, which can be exploited by AI systems designed to maximize attention or simulate care.

Age alone is not a reliable marker of readiness for these tools. Young people deserve thoughtful, proactive, and protective regulatory safeguards, especially when powerful technologies and profit-driven systems are shaping their development.

We therefore call on lawmakers, regulators, and technology companies to adopt comprehensive, enforceable safeguards that govern the deployment and commercialization of AI technologies to minors, particularly those designed to capture, maintain, or monetize their attention or emotional states.

The Risks of AI

The risks of AI are not speculative. Researchers at the Stanford School of Medicine’s Brainstorm Lab for Mental Health Innovation and Common Sense Media found that social AI companions — which are intended to build human-like relationships (rather than just convey information or complete tasks) — routinely claimed to be real, have feelings, and engage in human behaviors, despite legal disclaimers otherwise. In tests, researchers noted dangerous and misleading advice, including promoting cyberbullying and offering positive messaging about self-harm. And they found AI companions exacerbated mental health conditions in already vulnerable teens and created compulsive attachments and relationships.

These are just some of the very real risks our youth are already facing. Findings from the APA further emphasize the developmental risks of AI-mediated interactions, particularly those that mimic peer or therapeutic relationships without real care or accountability. The research community should continue to study the potential harms to inform policy and practice. Critical areas of concern include: 

  • Distorted reality and harmed trust. Generative AI (the type designed to complete tasks or convey information) and algorithmic amplification can spread misinformation, worsen body image issues, and enable realistic deepfakes, undermining young people’s sense of self, safety, and truth.
  • Invisible manipulation. AI curates feeds, monitors behavior, and influences emotions in ways young people often cannot detect or fully understand, leaving them vulnerable to manipulation and exploitation. This includes algorithmic nudging and emotionally manipulative design.
  • Content that can escalate crises. Reliance on chatbot therapy alone can be detrimental because it offers inadequate support and guidance. Without clinical safeguards, chatbots and AI-generated search summaries may serve harmful content or fail to alert appropriate human support when someone is in distress, particularly youth experiencing suicidal thoughts.
  • Simulated support without care. Chatbots posing as friends or therapists may feel emotionally supportive, but they can reinforce emotional dependency, delay help-seeking, disrupt or replace real friendships, undermine relational growth, and simulate connection without care. This is particularly concerning for isolated or vulnerable youth who may not recognize the limits of artificial relationships.
  • Deepening inequities. Many AI systems do not reflect the full variety of youth experience. As a result, they risk reinforcing stereotypes, misidentifying emotional states, or excluding segments of youth, particularly LGBTQIA+ youth, youth of color, and those with disabilities.

These AI-specific risks come on top of those youth already face on digital platforms: compulsive use, increased body image concerns, anxiety, depression, suicidal ideation, and behavioral manipulation.

JED does not oppose AI innovation, nor are we seeking to turn back the clock on a popular technology that could have positive impacts in many areas of life, including youth health and well-being. Early research on the use of therapy chatbots programmed to use cognitive behavioral therapy to reduce symptoms of anxiety and depression shows promising results. And JED is actively exploring ways that AI can increase access to evidence-based resources and help young people navigate stress and emotional challenges. 

However, the promise of AI must not justify an approach that exposes young people to untested, emotionally manipulative, or harmful systems, and prioritizes innovation and commercialization at any cost. We believe: 

  • AI must be youth-informed, ethically designed, and protective of mental health. 
  • Youth mental health, harm reduction, and suicide prevention must be core to the design, safety, and governance priorities of all AI, not an afterthought. 
  • Safeguards for minors and other populations with increased risks must be robust, enforced, and regularly evaluated for efficacy. 

These principles must apply not just to AI products labeled as “health” or “wellness” tools, but also to emotionally responsive systems embedded in entertainment, education, or daily interaction platforms. They must also apply across the entire AI lifecycle, from data sourcing and product design to deployment and evaluation.

AI is being positioned as a scalable solution for emotional needs. But without safeguards, it may simulate care without delivering it, creating systems that fail when youth need them most.

Regulatory and Industry Action Are Required 

We do not believe that protecting youth and AI innovation and growth are mutually exclusive. But AI developers, tech platforms, policymakers, and educators must prioritize the emotional health and safety of young people in every phase of AI development, deployment, and oversight. 

We recommend:

  • Design with youth development and emotional well-being at the core. AI systems must be grounded in child development and mental health science. They must support emotional regulation, identity formation, and human connection. Systems should avoid automating emotional care or oversimplifying complex psychological needs. Emotional safety should be tested before deployment and continuously evaluated.
  • Ban emotionally manipulative and dependency-forming design. Prohibit features that simulate friendship, intimacy, or therapeutic care for youth. This includes emotionally responsive AI companions, chatbots that mimic caring adults or peers, gamified nudges, and systems designed to elicit emotional dependency. These tools must never be positioned as substitutes for trusted relationships or professional supports. AI companions should be banned outright for use by minors, except under strict clinical supervision and regulatory oversight.
  • Ensure transparency, accountability, and meaningful oversight. Young people and caregivers must always know when they’re interacting with AI, what data is being collected, and how it shapes decisions, content, and outcomes. Platforms must implement robust, privacy-respecting age verification and provide clear disclosures to youth and caregivers. Human oversight should be required for any system affecting youth health, safety, or emotional well-being.
  • Prevent emotional exploitation and commercial harm. AI tools that influence how youth seek help, process emotions, or engage with mental health content must meet clinical standards, avoid reinforcing despair or risk through algorithmic loops, and always prioritize connection to trusted, human support, especially for those in distress or who are emotionally vulnerable. Companies must not collect or infer sensitive emotional data, or personalize content based on behavioral vulnerabilities, especially for commercial gain. And AI systems must be audited for bias and harm across youth of different backgrounds, experiences, and mental health statuses.
  • Center youth in design and governance. Young people must help shape the tools that influence their lives. This includes participatory design processes, feedback mechanisms, and representation in governance frameworks, policy development, and oversight mechanisms.
  • Integrate AI literacy into platforms and partnerships. Tech companies must invest in helping youth, caregivers, and other caring adults like educators understand and safely navigate AI. This includes clear educational content within products as well as partnerships to deliver AI and media literacy. 

Protective Policies Are Needed

Tech companies play a critical role in shaping the digital experiences and, therefore, the lives of young people — but they cannot be expected to prioritize youth safety and mental health without clear standards and accountability. To ensure that innovation truly serves the next generation, it is time to establish enforceable guardrails that align AI development with long-standing child protection principles.

To accomplish this, JED calls for the following policy actions:

  1. Codify age-appropriate design and safety standards by establishing enforceable federal and state laws that require privacy-by-default, age-appropriate interfaces, and strict limits on deceptive design patterns, autoplay features, algorithmic amplification of harmful content, and addictive mechanics.
  2. Prohibit the use of emotionally manipulative or synthetic relational AI by minors without strict oversight and testing, particularly in contexts that mimic therapy, friendship, or emotional dependency.
  3. Implement universal, privacy-preserving age-verification systems to restrict AI-powered platforms from engaging minors without appropriate consent or oversight, with penalties for circumvention and noncompliance.
  4. Enforce transparency and accountability for any AI technology accessible to minors, including mandatory impact assessments, disclosures, and independent oversight.
    • Foster collaboration among lawmakers, regulators, technology companies, child advocates, and mental health experts to develop effective safeguards.
    • Require public disclosure by AI companies of any and all studies or other safety information about the risks of their products.
    • Mandate public disclosure of any financial relationships between AI and tech companies and scientific and medical researchers or experts.
    • Invest in public awareness campaigns to inform parents, educators, and children about the potential risks of AI chatbots.
  5. Prohibit behavioral targeting of minors through algorithms designed for engagement maximization or commercial gain.
  6. Protect youth data and likenesses. Use of biometric and emotional data to personalize experiences, drive recommendations, or train models must be strictly limited. The creation or dissemination of AI-generated likenesses of youth, including deepfakes, synthetic voices, and non-consensual images, should be explicitly prohibited. Platforms must implement detection, removal, and accountability mechanisms to prevent misuse and respond rapidly to harms.
  7. Require robust research behind stated interventions to ensure that claims about their potential benefits align with actual user experience.
  8. Strengthen federal enforcement powers, including Federal Trade Commission rulemaking authority, private rights of action, and meaningful penalties for noncompliance, and update laws to address the unique risks posed by AI chatbots used by minors.
  9. Establish a National Center for Youth and AI Ethics to oversee and coordinate research, standard-setting, and ethical guardrails at both the federal and state levels, especially in high-risk domains such as education, mental health, and child development.

JED believes that safeguarding youth mental health demands regulation of the technologies shaping their emotional, cognitive, and social development. Young people, and their healthy development, must be protected from being leveraged and exploited as a commercial market for financial gain and profiteering by any entity. Their time, attention, and emotional well-being should never be considered fair game for corporations seeking to maximize profit.

This is not a call to halt innovation. It is a call to ensure innovation serves, rather than harms, the next generation and always puts young people’s safety and well-being ahead of profits. Policymakers, technology leaders, and child advocates must act with urgency. The mental health of millions of young people — and the ethical foundation of our digital future — depends on it.

More From JED About AI and Mental Health

National & Local Media Coverage of JED’s 2025 Student Voice of Mental Health Award (SVMHA) Recipients

Find news coverage of JED’s SVMHA honorees Rohan Satija (high school) and Nora Sun (undergraduate).

  • ABC News: Student advocates address youth mental health problem
  • KXAN News: Austin student honored with mental health award
  • Yahoo! News: Austin student honored with mental health award
  • The Daily Texan: Incoming freshman receives JED’s 2025 Student Voice of Mental Health Award
  • WJCT News: First Coast Connect

National & Local Media Coverage of JED’s 2025 NYC Gala

Find news coverage of JED’s annual Gala held on June 4 at Cipriani Wall Street in New York City. The sold-out event recognized honorees, presenters, and special guests championing JED’s work.

  • Allkpop: Eric Nam honored with ‘Voice of Mental Health’ award at The Jed Foundation Gala
  • ABC News: Student advocates address youth mental health problem
  • KXAN News: Austin student honored with mental health award
  • Yahoo! News: Austin student honored with mental health award
  • The Daily Texan: Incoming freshman receives JED’s 2025 Student Voice of Mental Health Award
  • Asian Hustle Network: Celebrating Hope: JED Gala 2025 Unites Advocates for Youth Mental Health

JED’s 2025 Gala Celebrates Mental Health Advocates and Raises Over $1.6 Million for Youth Mental Health Support

Photo from inside The Jed Foundation's 2025 Gala, including attendees smiling, clapping, and enjoying the event.

More than 650 supporters, partners, staff, and other friends of The Jed Foundation (JED) gathered on June 4 at the organization’s annual gala, raising more than $1.6 million to support JED’s mission of protecting mental health and preventing suicide for teens and young adults.

The evening, hosted for the second time by Emmy Award-winning journalist Savannah Sellers, celebrated JED’s accomplishments and honored youth and adults who embody its mission — while also acknowledging the challenges that remain in ensuring all young people have access to the support they deserve. 

“We are gathered in this beautiful space to help The Jed Foundation determine the future of an entire generation,” Sellers said in her introductory remarks. “And that’s not hyperbole. That is the reality that we face in America today.”

Sellers said she has spoken to dozens of teens in the past year and always asks “if they are feeling anxious, depressed, or hopeless” — and “every single one” has raised their hands. 

“And that is why we need The Jed Foundation to be there for every single young person who needs them,” she concluded.

Celebrating This Year’s Honorees

The sold-out gala featured performances by Tony Award-nominated actress Lorna Courtney, Tony-, Emmy-, Grammy-, and Pulitzer-winning composer Tom Kitt, BAILEN the Band, and the Brooklyn United Marching Band. JED also celebrated honorees who have demonstrated exceptional commitment to mental health advocacy.

Victoria’s Secret PINK was recognized with the 2025 Corporate Voice of Mental Health Award for its commitment to raising awareness and providing resources for young people navigating mental health challenges. The company has partnered with JED for five years, and in 2024 sponsored the PINK with Purpose Project to honor 10 Gen Z advocates, some of whom attended the gala. Each awardee received a $25,000 grant to support mental health and strengthen communities.

“Victoria’s Secret and PINK are committed to continuing this vital work, creating a world where mental health is talked about openly, where it’s safe to seek help, and where every person feels truly seen and supported,” said Leslie Nixon, AVP of Community Relations at Victoria’s Secret, who accepted the award on behalf of the company. “Let’s continue to support our communities, champion organizations like JED, and do our part to make a tangible difference for young people every single day.”

Ally Love accepts the 2025 Voice of Mental Health Award from Savannah Sellers, the evening’s host.

For their leadership in championing open dialogue about emotional health, Ally Love, a Peloton instructor, TODAY contributor, and founder/CEO of Love Squad, and Eric Nam, singer, actor, entrepreneur, and co-founder of DIVE Studios, were honored with the 2025 Voice of Mental Health Awards.

In her acceptance speech, Love emphasized the importance of reaching out for help when we need it — and how hard or inaccessible that is for too many people.

“This is for the parent who doesn’t feel they can ask for assistance because society somehow has told you that if you make it alone, you are some type of supreme being. This is for the women who believe that love is contingent if they are perfect. This is for the father who feels the weight of being the sole provider is their only identity. It’s for the teenager who thinks their feelings are flaws. For the friend who always checks on others but hasn’t been checked on in a very long time,” she said. 

Nam talked about how he’s incorporated mental health themes into his music and the power of storytelling to create “space for others to breathe, to be seen, and to speak.”

“While the people in this room tonight may be the strongest allies to youth and their mental health, it’s important to remember – you may be the exception and not the rule,” Nam said. “Many young people still fear being honest and are often met with disbelief, discomfort, or silence when they find the courage to open up.”

As it does every year, JED also honored two young people with the Student Voice of Mental Health Awards. This year’s Student Voice of Mental Health Awards were presented by Remi Bader — content creator, model, and fashion inclusivity and mental health advocate — to Nora Yanyi Sun of Harvard University and Rohan Satija of Westwood High School. 

2025 Student Voice of Mental Health Award winners Rohan Satija and Nora Sun join JED Student Engagement Manager Mary Bess Pritchett on the blue carpet at the 2025 JED gala.

At Harvard University, Nora Sun has focused her advocacy on expanding equitable mental health support, with a strong belief in innovation and peer-to-peer tools that meet young people where they are. In her remarks, she reflected on her vision: “I look forward to ushering in a future where mental wellbeing can be universally accessible, and The Jed Foundation will definitely be a part of that dream coming true.”

Rohan Satija, a student at Westwood High School, shared the power of storytelling as a tool for emotional health: “I found comfort and solace in the school library. Reading books about characters who felt different, characters whom I could relate to, writing stories where I could rewrite the endings, and performing in theatre productions where I didn’t have to be myself to feel seen. Storytelling became therapeutic for me, helping me improve my mental health.”

The Work Ahead

JED CEO John MacPhee addresses attendees at the organization’s 2025 gala.

Even as the gala celebrated the accomplishments and progress that have been made, it also served as a reminder of the challenges that remain. In his remarks, JED CEO John MacPhee said that policy debates over issues such as individual rights, funding for Medicaid and other support services, and approaches to fairness and inclusion are taking their toll on young people’s mental health and sense of safety. And, he noted, many digital products and platforms are designed to maximize profits at the expense of youth well-being.

“Every young person deserves to feel supported, valued, loved, and safe as a member of the community — for who they are,” MacPhee said. “Young people thrive when they have solidity, connectedness, purpose, hope for a bright future, and the coping skills to handle challenges. They need caring adults and institutions to consistently show up for them. Let’s give them that.”

See more photos from the gala, and support JED’s transformative work.

National & Local Media Coverage of JED’s Support for Continued Funding of 988 Lifeline’s LGBTQ+ Youth Services

Find local and national news coverage on JED’s support of The Trevor Project in calling for the continued funding of specialized services through the 988 Crisis and Suicide Lifeline for LGBTQ+ youth.

  • NBC News: White House proposes axing 988 suicide hotline services for LGBTQ youth
  • PinkNews: Trump administration confirms plan to scrap funding for ‘life-saving’ LGBTQ+ suicide hotline
  • The Advocate: Trump administration finalizes plan to eliminate LGBTQ+ 988 crisis services during WorldPride
  • Mental Health Weekly: JED responds to proposal to cut 988 services for LGBTQ+ youth

Get Help Now

If you or someone you know needs to talk to someone right now, text, call, or chat 988 for a free confidential conversation with a trained counselor 24/7. 

You can also contact the Crisis Text Line by texting HOME to 741-741.

If this is a medical emergency or if there is immediate danger of harm, call 911 and explain that you need support for a mental health crisis.