• Signal sharing between companies may exacerbate certain human rights risks that exist within individual company efforts to fight OCSEA, such as those related to freedom of expression (e.g., through over-moderation of content) or privacy (e.g., through overbroad data sharing or monitoring). The cross-platform nature of the Lantern Program may also lead to cumulative impacts, as a result of multiple companies taking action on signals in ways that compound adverse impacts on human rights.

• Human rights risks and/or challenges may arise due to a variety of factors, including:
  › Unsubstantiated signals (i.e., when companies mistakenly identify users or content as potentially harmful and share related signals on the Lantern database);
  › Government requests or involvement with the Lantern Program;
  › The lack of clear or standardized definitions for some types of OCSEA-related behaviors, such as grooming, and the related risk of scope creep;
  › Differences in company policies or enforcement processes for handling edge cases (e.g., legal but harmful content) or gray areas (e.g., self-generated explicit imagery or peer-to-peer offenses);
  › Challenges related to age determination, assurance, and verification;
  › The development of new technologies such as diffusion models and generative AI that enable the creation and distribution of synthetic child sexual abuse material (CSAM).

• Children’s interaction with the digital environment and the associated impacts on their digital rights are constantly evolving. For example, their experiences with and attitudes toward self-generated explicit imagery are shifting rapidly, and there is a lack of in-depth research on how such online behaviors may affect children’s development and digital rights. This may make it challenging for online platforms to address the risks proportionately and to empower children to enjoy their rights in the digital environment.

• Participant engagement with the Lantern Program varies significantly, mainly due to resourcing constraints, such as limited content moderation capacity, as well as philosophical differences between companies’ content governance approaches. For example, while most participants conduct manual review and verification of signals shared via the Lantern database, others may automatically action signals without verifying the violating content or conduct on their platform.

• The Lantern Program can serve as a key resource for smaller technology companies with limited capacity to identify and investigate all OCSEA risks or harms occurring on their platform, by facilitating cross-industry knowledge sharing. While signal sharing can help companies prioritize content for review, a company still needs dedicated resources to action the signals and to use the Lantern Program effectively.

• Legal frameworks may complicate the Lantern Program’s ethos of voluntary action. Although the Lantern Program is a voluntary program under which participants are not obligated to carry out any moderation activities, legal frameworks may create pressure on participants to take action on all signals received, which could deter participants from fully engaging in the Lantern Program or lead them to automatically action ingested signals.

• Accountability and responsibility related to the Lantern Program are shared between the Tech Coalition and participating companies. Each participating company is responsible for how it shares and uses signals as part of the program, and is individually accountable for risks associated with its use of the signals. The Tech Coalition, on the other hand, is responsible for putting
