5.2 Human Rights Risk Factors

• Signal sharing between companies may exacerbate certain human rights risks. When individual companies fight OCSEA alone, they face risks to other human rights, such as those related to freedom of expression (e.g., through over-moderation of content) or privacy (e.g., through overbroad data sharing or monitoring). The severity and likelihood of these risks may increase with the cross-platform nature of the Lantern Program. For example, signal sharing may increase the likelihood of erroneous content removal or blocking of user accounts because signals are further decontextualized when shared across platforms (e.g., a company flags all text exchanges containing a certain keyword and uploads them to the database; these signals are then interpreted differently by other participants, who cannot determine the context of communications that took place on another platform). This risk of exacerbating human rights impacts is greater for content and behaviors with less clear definitional boundaries, such as grooming and sextortion, than for CSAM, which is more clearly defined.

• There may be cumulative impacts associated with cross-platform signal sharing. The human rights impacts of company action to fight OCSEA can be assessed at two different levels: (1) individual company-level impacts, and (2) cumulative impacts arising from, or exacerbated by, the actions of more than one company. Due to its cross-platform nature, the Lantern Program may lead to multiple companies taking action on signals in a way that results in compounded adverse impacts on human rights. For example, if multiple companies take action on unsubstantiated signals, users may be wrongly denied access to online services across multiple platforms.
The Tech Coalition will seek to mitigate this risk by requiring companies to make their own independent decisions in accordance with their own policies and understanding of their legal obligations. However, there remains a risk that participants may ignore this requirement and action signals based on other companies' decisions without sufficient due diligence of their own.

• Human rights risks may arise from unsubstantiated signals. Companies may mistakenly identify users or content as potentially harmful and share related "unsubstantiated signals" in the Lantern Program database. Companies may then take action to erroneously remove content or shut down a user's account based on those unsubstantiated signals. While the risk of acting on unsubstantiated signals already exists within individual company practices, its likelihood may increase with cross-platform signal sharing because signals are further decontextualized when shared across platforms. Similarly, the resulting human rights impacts may be more severe when multiple companies take action, for example if users are wrongly denied access to online services across multiple platforms.

• Government requests or involvement with the Lantern Program may be associated with human rights risk. Although the Tech Coalition does not plan to engage directly with governments, governments or law enforcement agencies may gain access to signals in the database by engaging with individual companies. Participating companies often interact directly with law enforcement agencies and may receive legal requests, or may otherwise be encouraged or coerced into sharing intelligence or signals, depending on the level of control governments exercise over technology companies in different countries. This may impact users' right to privacy and lead to surveillance and threats to bodily security or due process.
Governments or law enforcement agencies may also make requests to the Tech Coalition or participants of the Lantern Program to share user data. Where such requests are overly broad or

BSR TECH COALITION HUMAN RIGHTS IMPACT ASSESSMENT 25
