Discord, the popular chat app favored by gamers, is rolling out a suite of new safety features aimed at giving parents more insight into their teen’s activity. These changes come as the platform faces increased scrutiny and multiple lawsuits alleging it facilitates the exploitation of young users.

Enhanced Parental Visibility Within the Family Center

The revamped Family Center will now allow parents or guardians to track a week’s worth of their teen’s interactions. Key features include:

  • Top Interactions: Parents can see the five users their teen has messaged or called most frequently.
  • Server Activity: Visibility into the servers where a teen is most active.
  • Communication Minutes: A record of total call minutes spent in both voice and video chats.
  • Purchase History: A complete view of all purchases made within the app.

It’s important to note that this tracking is limited to the past week; to view older activity, parents will need to refer to previous email summaries. This limitation raises questions about the features’ usefulness for ongoing monitoring: parents who don’t check in regularly could miss crucial information.

Additional Safety Tools and Controls

Beyond tracking, the new Family Center offers several additional tools:

  • Reporting Alerts: Teens can now notify their parents when they’ve reported objectionable content.
  • Customizable Settings: Guardians can enable filters for sensitive content and control who can send direct messages (DMs) to their teen—either friends only or all server members.

These changes reflect feedback from parent groups and organizations such as the National Parent Teacher Association and the Digital Wellness Lab at Boston Children’s Hospital, according to Discord’s global head of product policy, Savannah Badalich. She emphasized the company’s goal to provide “more visibility and more control” for parents.

Ongoing Legal Challenges and Criticism

Despite the new features, Discord is facing a growing number of lawsuits alleging it has failed to protect young users. Haley McNamara, executive director and chief strategy officer of the National Center on Sexual Exploitation, criticized the rollout as placing “the burden exclusively on overwhelmed parents” without addressing Discord’s responsibility in creating a safe platform. The organization has previously listed Discord on its “Dirty Dozen” list, citing concerns about sexual exploitation.

A lawsuit filed earlier this year alleged that Discord and Roblox together created a “breeding ground for predators,” referencing an anonymous 11-year-old girl who was reportedly groomed and sexually exploited on both platforms. Dolman Law Group, representing multiple plaintiffs in separate lawsuits, has named Discord as a co-defendant in cases involving harassment, manipulation, and even suicide allegedly linked to online predators who used the app to communicate with vulnerable young people.

Matt Dolman, founder of Dolman Law Group, welcomed the new resources but said they are “far too little and way too late” for survivors of past abuse.

Balancing Privacy and Safety

Discord acknowledges the delicate balance between giving teens the privacy they desire and providing parents with the tools needed to keep them safe. The company says it proactively works to identify and flag potentially harmful content and accounts, and it maintains a strict policy prohibiting child sexual abuse material.

Ultimately, Discord hopes to foster open conversations between teens and their parents about online safety, providing guides and resources within the Family Center to facilitate these discussions. The rollout of these features represents a significant response to mounting concerns and legal challenges, although their long-term impact on user safety remains to be seen.