We currently host a range of Advisory Councils, Knowledge Sharing Groups, and Workstreams where our members collaborate on efforts to keep children safe online.
Get Involved In Groups | Slack Channel Cheat Sheet
---
Each Advisory Council advises on the strategic roadmap for the pillar of work that contributes to the Tech Coalition's overall goals.
Chair: Kade Farrell (GoDaddy)
Frequency: Quarterly
This Advisory Council meets quarterly to promote, facilitate engagement with, and recommend funding of research to advance the understanding of the experiences and patterns of online CSEA and learn from effective efforts to prevent, deter, and combat it. It gives priority to research that leads to actionable insights, which can be utilized by practitioners both within and outside of industry.
Chair: Despina Papageorge (Cloudflare)
Frequency: Bi-monthly
This Advisory Council meets bi-monthly to promote accountability across industry through transparency reporting, as a means of driving progress in the fight against CSAM.
Chair: Jess Lishak (TC)
This Advisory Council meets quarterly to provide insights and feedback on priorities related to communicating industry progress on combating online child sexual exploitation and abuse.
Co-Chairs: Amar Kamat (Yahoo), David Callies (Meta), and Orlando Cardoso (Sony)
This Advisory Council meets bi-monthly to accelerate the development and uptake of groundbreaking technology that supports a multi-sector ecosystem working to thwart child sexual exploitation and abuse.
Co-Chairs: Katie Witwer (Yahoo) and James Gatlin (Pinterest)
This Advisory Council meets quarterly to expedite the adoption of effective practices and improvements in members' online child safety programs by facilitating greater industry sharing of information, expertise, and knowledge related to OCSEA content, behaviors, and trends.
Knowledge Sharing Groups and Workstreams allow closer collaboration on projects and topics, homing in on resources or outcomes that advance our mission.
Policy | Harm-Specific | Function-Based | Trust & Safety | Event Planning | Lantern Only
Policy
Chairs: Tim Lynch (Yahoo), Billy Easley (Reddit), and supported by Nicholas Wells (TC)
Frequency: Monthly
This group meets monthly for an ongoing discussion on current and upcoming regional legislative developments in the Americas.
Chairs: Mariko Lawson (Amazon), Smrithi Ramesh (Cloudflare), and supported by Nicholas Wells (TC)
This group meets monthly for an ongoing discussion on current and upcoming regional legislative developments in APAC.
Chairs: Julia Mozer (Google), Julie Guichard (Microsoft), and supported by Nicholas Wells (TC)
This group meets monthly for an ongoing discussion on current and upcoming regional legislative developments in the EU/EMEA.
Harm-Specific
Lead: Leah Treitman (Meta) and supported by Rita Fabi (TC)
Frequency: Bi-annually
This group meets twice annually to discuss the latest insights on financial sextortion. Previously, the group created a comprehensive toolkit on understanding and preventing financial sextortion.
Leads: Leah von Eichel-Streiber (EA), Abby Kurland (Meta), and supported by Amber Hawkes (TC)
Frequency: Bi-weekly
This group meets bi-weekly to explore trends in child sex trafficking and additional harms associated with live-streaming. Topics include practices and challenges in relation to policies, prevention, detection and enforcement/reporting, and product design and technology related to boosting safety in live-streaming and audio tools.
Lead: Alycia Little (Pinterest) and supported by Rita Fabi (TC)
This group meets bi-weekly to discuss policies, detection, enforcement, and prevention of content that sexualizes minors. It also identifies common practices for evaluating and actioning memes and edge cases, including bulk reporting.
Lead: Kay Chau (TC)
Frequency: As needed
This group meets as needed to discuss how to respond to and mitigate the impact of apps/software used to "nudify" images and videos.
Function-Based
Lead: Erin Wickersham (YouTube) and supported by Rita Fabi (TC)
This group meets monthly to discuss the process of gathering intelligence on potential offenders on platforms.
Lead: Sarah Mower (Yahoo) and supported by Rita Fabi (TC)
This group meets quarterly to discuss investigations, tooling considerations, and supplemental reporting.
Lead: Stephen Dufresne (Snap) and supported by Rita Fabi (TC)
This group meets quarterly to discuss ways to better communicate with law enforcement, share contacts, and determine best practices for LE escalations.
Leads: Gayle Argon (Dropbox), Rachel Haney (Meta), and supported by Nicholas Wells (TC) and Amber Hawkes (TC)
This group meets monthly to discuss general legal issues with regards to OCSEA.
Trust & Safety
Lead: Amber Hawkes (TC)
This group meets monthly to discuss age assurance, evaluating product risks, regulatory compliance, and more. The group has heard from third-party age assurance providers in previous meetings, and members will be sharing implementation challenges and learnings on topics such as measuring effectiveness, scaling considerations, weaponization and circumvention trends, and more.
Leads: Rita Fabi (TC) and Amber Hawkes (TC)
This group meets monthly for members who operate in the domain space to discuss a child safety roadmap specific to that sector.
Leads: Paul Sanders (Sony), Juliamarie Dekle (Xbox), and supported by Rita Fabi (TC)
This group meets twice annually for members in or adjacent to the gaming sector to discuss emerging trends.
Lead: Rita Fabi (TC)
This group meets monthly to discuss testing opportunities and legal limitations around red teaming, training data, and tactics to mitigate harm at input and output.
Event Planning
Lantern Only
Lead: Ruth Dannehy (TC)
This group meets bi-monthly in a collaborative workstream with cloud hosting and domain providers to detect and track CSAM hosting across platforms, utilizing innovative detection methods and cross-platform intelligence sharing.
This group meets bi-monthly as a collaboration between social media platforms and financial institutions to improve detection and disruption of financially motivated sextortion through better signal sharing and pattern identification.
This group meets bi-monthly, bringing together gaming and adjacent messaging/live-streaming platforms to share intelligence and detect predatory actors grooming and enticing children across gaming services.
These groups have been discussed and approved by Tech Coalition and/or Lantern members.
Frequency: One-off
A follow-up to the February 10, 2026 discussion will be held in Spring 2026 for companies navigating the intersections between OCSEA and non-consensual intimate imagery (NCII), particularly in the context of the Take It Down Act and Stop NCII’s hash database.
More information coming soon.
These groups have completed their planned discussions and successfully created resources or planned events.
This group considered how to set up an appeals process, relevant legal requirements, case studies, and how to scale to handle increased volume. It was led by Kay Chau (TC).
This group convened to plan the 2025 APAC Industry Briefing.
This group convened to plan the EU Gen AI Briefing which occurred on Nov 21, 2024.
This group convened to plan the TC’s Financial Sextortion Multi-stakeholder Forum events.
This group assessed child safety risks on Gen AI products and discussed how to ensure mitigation buy-in. It is now part of Gen AI: Content and Reporting Template. It was run by Rita Fabi (TC).
This group met as needed and focused on creating a template for companies to use when reporting AI-generated CSAM and other exploitative content to NCMEC. It was supported by Rita Fabi (TC).
This group discussed the legal implications and issues surrounding the advent of generative AI and child safety.
This group met quarterly for companies with a live Generative AI model to discuss training data, adversarial testing, and tactics to mitigate harm at input and output. It is now combined with the Gen AI Red Teaming group. Led by Rita Fabi (TC).
This group met to discuss testing opportunities and legal limitations around red teaming for OCSEA. It is now combined with the Gen AI Models group.
This group, focused on grooming guidelines, ended in June 2023 but still has a group Slack channel to discuss best practices, share insights, and more.
This one-time discussion, held on February 10, 2026, was for companies navigating the intersections between OCSEA and non-consensual intimate imagery (NCII), particularly in the context of the Take It Down Act and Stop NCII’s hash database. A follow-up conversation will be held in Spring 2026. Led by Rita Fabi (TC).
This group examined the specific harm type of monetization of content by adults (for example parents) that primarily features children. It was run by Rita Fabi (TC).
This group planned the TrustCon Invite-Only Connect Track.
This group convened to plan the UK Gen AI Briefing which occurred on May 9, 2024.
This group met to prepare for the TC's session on OCSEA measurability at the UN General Assembly in September 2025.
This group convened to plan the US Gen AI Briefing which occurred on December 11, 2023.