Potential Opportunities

• Signals shared in the Lantern database may enable participants to more accurately identify exploitation or abuse of children on their platforms, leading to fewer erroneous actions.

• Information deduced from Lantern Program signals may improve participants' OCSEA moderation efforts. Insights about emerging trends, patterns of conduct, key markers that reliably indicate potential or actual child sexual abuse or exploitation, and gray areas such as self-generated explicit imagery or minor-to-minor interactions may enable participants to improve their manual and automated moderation processes and further advance best practices in the field.

Impact Factors

• Severity: The scope of users that could be affected by potential adverse impacts arising from the actioning of signals is large, with the scale of impact ranging from serious to potentially life- or liberty-threatening. For example, users may be unable to access digital platforms necessary for their livelihood, or may be arrested, interrogated, or detained by law enforcement agencies as a result of erroneous actions. Adverse impacts arising from such actions may range from possibly remediable (e.g., via restoration of a user account that was inaccurately removed) to not remediable (e.g., where an arrest or loss of employment has occurred).

• Likelihood: The likelihood of adverse impacts related to the actioning of signals may be greater with the Lantern Program compared to individual company practices, and will vary depending on the participant taking action, the type and size of its platform, and its content moderation resources. Large platforms focused on user-generated content and interactions may have higher volumes of content to moderate and therefore a higher likelihood of unsubstantiated signals and erroneous action. Better-resourced companies may be more capable of reviewing the signals shared in the Lantern database, verifying their accuracy before taking action, and mitigating risks in a timely manner. Additionally, the likelihood of unsubstantiated signals and erroneous participant action may be higher for underrepresented languages, dialects, or markets where participants have fewer content moderation resources.

• Attribution: Although risks related to erroneous actions already exist within individual companies' OCSEA moderation efforts, these risks may be amplified or exacerbated by the cross-platform nature of the Lantern Program. For instance, impacts may be compounded if users lose access to multiple platforms that serve social or economic purposes as a result of an unsubstantiated signal from one participant. Where the impact is compounded by signal sharing among platforms, the Tech Coalition may be more closely associated with the harm.

• Leverage: The Tech Coalition has developed guidelines and conditions for participation in the Lantern Program that require participants to conduct independent verification of signals and identify violations on their own platform prior to taking any action. However, participants' approaches to content moderation and capacity for verification vary, and some participants may take enforcement action against users for violations that occur outside their platform, which may further amplify the potential adverse impacts associated with the actioning of signals.
