such as general-purpose tools openly available for purchase online. To institute gating approaches at scale, companies may want to consider two options:

i. Gating undertaken by the SaaS provider before a sale: This could include assessments of public human rights commitments, company human rights ratings and rankings, the nature of company ownership, countries of concern, high-risk application areas, and the nature of the intended use of the SaaS product or service. This approach could involve an escalation process for potentially concerning sales. It would be most suited to situations in which the SaaS provider is delivering customized solutions for customers.

ii. Customer-led gating process before a sale: This could include requirements that the customer must fulfill before obtaining rights to use the product or service, such as agreeing to service-specific terms, undertaking human rights training, agreeing to implement an opt-out or opt-in process, and/or establishing a reporting channel.

2. Technology and Design Choice

Establish technical limitations on SaaS service functionality to restrict how it can be used and/or the addition of features or customizations that have the potential for adverse impacts. Companies can begin to do this by taking the following actions:

Assess Data Sets and Data Sources to Avoid or Reduce Bias

Conduct systematic reviews and vet training data as part of the due diligence process. This includes reviewing data sets for adherence to internal responsible data principles, guidelines, or best practices from external sources.
When relevant, internal processes and procedures should also include: checks on whether proposed data sources for AI solutions are reputable and not affiliated with political or special interest groups; assessments of whether the data is representative of vulnerable or marginalized populations; the use of techniques, such as re-weighting the data, that train models evenly despite biased datasets; and the creation of, and adherence to, guidelines on unbiased data selection for employees.

Adversarial Testing

Require parties involved in the design, development, and use of the SaaS product or service to conduct stress testing or adversarial testing to identify potential technological errors, ways in which the service could be misused or abused, or ways in which the use case could result in adverse human rights impacts. Consider bringing in external experts and stakeholders to assist with this process.

This could also be conducted in collaboration with customers as part of the product development or proof-of-concept process for customized platforms, services, or products.

SaaS providers can integrate futures methodology and strategic foresight into adversarial tests to consider how SaaS services, and the contexts in which they are deployed, may change and evolve over time. It is important to consider a wide range of potential impacts that might arise in the future, and not be constrained by human rights impacts that are well known today.

37 Human Rights Assessment of the Software-as-a-Service Sector
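The data re-weighting technique mentioned above can be illustrated with a minimal sketch. The function below is a hypothetical example (not drawn from this assessment): it assigns each training example a weight inversely proportional to the frequency of its group label, so that under-represented groups contribute equally to model training despite a skewed dataset.

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Assign each example a weight inversely proportional to the
    frequency of its group label, so under-represented groups
    contribute as much to training as over-represented ones."""
    counts = Counter(labels)
    n_groups = len(counts)
    total = len(labels)
    # weight = total / (n_groups * count): each group's weights
    # then sum to total / n_groups, balancing its overall
    # contribution to the training loss
    return [total / (n_groups * counts[y]) for y in labels]

# Hypothetical dataset skewed 4:1 between two demographic groups
labels = ["majority"] * 4 + ["minority"] * 1
weights = inverse_frequency_weights(labels)
```

In this sketch, the four majority-group examples and the single minority-group example each receive the same total weight, so a model trained with these per-example weights sees both groups evenly. Most training frameworks accept such weights directly (for example, as per-sample weights in a loss function).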
