Non-Confidential Version

APPLE DISTRIBUTION INTERNATIONAL LIMITED

App Store – Report on Risk Assessment and Risk Mitigation Measures pursuant to Articles 33, 34 and 35 of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act)

28 August 2023

App Store – Report on Risk Assessment and Risk Mitigation Measures

OVERVIEW

This Risk Assessment Report is structured as follows:

Section 1 provides an overview of the Commission's decision to designate the five App Stores as a Very Large Online Platform under the Digital Services Act, an overview of the role of Apple Distribution International Limited, and a summary of the Article 34 and Article 35 DSA obligations.

Section 2 provides an overview of the methodology and steps taken by Apple to carry out the risk assessment, which included discussions with relevant functions within Apple, documentary research and review, controls mapping, the approach to risk assessment and consideration of existing App Store risk mitigation measures, and preservation of documents.

Section 3 provides an overview of relevant Apple ecosystem (i.e. not App Store specific) functions, policies and protections. These apply to the use of all Apple devices, before, or indeed regardless of whether, a user engages with the App Store. This Section details practices and features relevant to user privacy, the protection of minors, Apple's approach to human rights, and its financial crime risk mitigation measures. These features and practices do not form part of the design, function or use of the App Store specifically, but nevertheless form part of Apple's approach to providing safe and trusted products and services to its customers. These practices are therefore described as background and wider context for the assessment of the risk and mitigation measures linked to the App Store's design, function and use. This Section 3 and Section 4 of the Report are included to assist those unfamiliar with key Apple ecosystem policies and practices and the operation of the App Store.

Section 4 provides an overview of the risk profile of the App Store, as the VLOP regulated under the DSA, from both an end user and a developer perspective. This Section is intended to provide the reader with a high-level overview of the operation of the App Store and key risk mitigation measures, to assist in understanding the assessment contained in Section 5 as to how the Systemic Risks might stem from the design, function or use of the App Store. As described below, a fuller explanation of the risk mitigation measures deployed in connection with the App Store is set out at Section 6.

Section 5 sets out Apple's assessment of whether and how any of the Article 34 Systemic Risks stem from the design, function or use of the App Store. Consistent with the Commission's expectations, Apple has, to the extent possible, sought to assess how each of the Systemic Risks could arise in principle from the design, function or use of the App Store, without reference to the extensive risk mitigation measures that are in place. This is notwithstanding the fact that, as many of those risk mitigation measures have been in place, and have been continuously enhanced, since the App Store's introduction 15 years ago, they are now deeply integrated in the App Store, rendering the assessment of Systemic Risk without reference to those measures potentially artificial in places.
Section 6 then provides detailed information on the App Store Article 35 risk mitigation measures that are relevant to the Systemic Risks identified in Section 5. This includes a summary of key terms and conditions, including the App Store Review Guidelines, for both developers and users, a detailed description of the App Review process, a summary of key functions involved in escalations of concerns or incidents, a summary of monitoring that continues after apps are published on the App Store, detailed information on App Store notice and action procedures which enable third parties to alert Apple to concerns regarding content on the App Store as well as problematic content in live apps, and descriptions of relevant recommender systems.

Section 7 concludes with an analysis of the reasonableness, proportionality and effectiveness of Apple's risk mitigation measures, as detailed in Section 6, in managing the Systemic Risks to the extent that they may stem from the design, function or use of the App Store, as identified in Section 5.

* * *

This Report was prepared solely for transmission to the European Commission, pursuant to Article 42(4)(a) and (b) of the DSA, and, upon designation, the Coimisiún na Meán. The Report is confidential and contains commercially sensitive information. It cannot be disclosed under Regulation 1049/2001 as this would undermine Apple's commercial interests, including its intellectual property. For the sake of completeness, Apple intends to publish a non-confidential version of the Report, in accordance with Article 42(4), following receipt of the audit report pursuant to Article 37(4).

CONTENTS

Section 1: VLOP designation and risk assessment and risk mitigation obligations
Section 2: Risk assessment and risk mitigation methodology
Section 3: Apple ecosystem functions, policies and protections
Section 4: The risk profile of the App Store
Section 5: Potential systemic risks arising from the design, functioning or use of the App Store
Section 6: Mitigation of potential systemic risks arising from the design, functioning or use of the App Store
Section 7: Reasonableness, proportionality and effectiveness of App Store risk mitigation measures
Schedule A: Relevant teams and functions referenced in the report

SECTION 1: VLOP DESIGNATION AND RISK ASSESSMENT AND RISK MITIGATION OBLIGATIONS

1.1 Section overview

1.1.1 This Section of the Report provides an overview of the designation of the App Store as a Very Large Online Platform ("VLOP"), an overview of the role of Apple Distribution International Limited, and the Digital Services Act (the "DSA") risk assessment and risk mitigation obligations.
1.2 Article 33(4) VLOP designation

1.2.1 On 26 April 2023, pursuant to Commission Decision C(2023) 2726 final of 25 April 2023, the European Commission (the "Commission") notified Apple Distribution International Limited (hereafter, "ADI") that it had designated Apple's "App Store" as a VLOP in accordance with Article 33(4) of the DSA. Article 1 of the Decision describes "App Store" as "App Store, consisting of iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store, and tvOS App Store". [1]

[1] ADI does not accept that iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store, and tvOS App Store all form part of a single online platform. ADI considers these services to be separate online platforms, which have significant material differences from both a developer and end user perspective. ADI considers that only iOS App Store should have been designated as a VLOP. Nonetheless, in the light of the definition of App Store in the Commission's decision, ADI has prepared this Report on the basis that it extends to iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store, and tvOS App Store. References to the "App Store" should accordingly be read as referring to all of those services.

1.3 The role of ADI

1.3.1 ADI is a company registered in Ireland and ultimately owned by Apple Inc. ADI is responsible for the provision of the App Store across the European Union ("EU"), and is therefore the "provider" of the VLOP service (i.e. the App Store) for the purposes of the DSA.

1.3.2 As such, ADI's board of directors (the "ADI Board") is the "management body of the provider of the [App Store]" that is responsible for the "sound management of systemic risks identified pursuant to Article 34", as required by Article 41 of the DSA.

1.3.3 Although ADI is responsible for the provision of the App Store in the EU, and for determining the purposes and means of processing personal data in the context of this provision, and considering that ADI personnel contribute to the policies, processes and procedures relevant to the provision of the App Store in the EU and globally, for the purposes of this Risk Assessment, and unless otherwise stated, we do not distinguish between ADI and Apple Inc. Instead, we refer to "Apple" policies, processes and procedures, without prejudice to which entity is providing the actual service or product being discussed.

1.4 Overview of Articles 34 and 35 obligations on Risk Assessment and Risk Mitigation

1.4.1 Article 34 of the DSA requires each VLOP provider to identify, analyse and assess any "...systemic risks in the [EU] stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services."

1.4.2 The risk assessment must be carried out within four months of the designation (i.e. by 28 August 2023) and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified in Article 34.

1.4.3 The risk assessment must be specific to the VLOP's service and proportionate to the systemic risks, taking into consideration their severity and probability. This recognises that different types of online platforms that have been designated as VLOPs will have different risk profiles.
1.4.4 Pursuant to Article 34, systemic risks within the EU include:

(a) "dissemination of illegal content;"

(b) "any actual or foreseeable negative effects for the exercise of fundamental rights...;"

(c) "any actual or foreseeable negative effects on civic discourse, electoral processes, and public security;" and

(d) "any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person's physical and mental well-being",

hereafter referred to as the "Systemic Risks". Section 5 below includes further detail on each of them.

1.4.5 When conducting the Article 34 risk assessment, VLOPs must take particular account of whether and how the following factors influence the Systemic Risks:

(a) "the design of their recommender systems and any other relevant algorithmic system;"

(b) "their content moderation systems;"

(c) "the applicable terms and conditions and their enforcement;"

(d) "systems for selecting and presenting advertisements;" and

(e) "data related practices of the provider".

1.4.6 In addition, VLOPs must "analyse whether and how the [Systemic Risks] are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions." They must also "take into account specific regional or linguistic aspects, including when specific to a Member State."

1.4.7 Further, Article 35 of the DSA requires providers of VLOPs to put in place "reasonable, proportionate and effective mitigation measures, tailored to the risks" identified through the risk assessment carried out pursuant to Article 34.

1.5 This Report

1.5.1 This Report on Risk Assessment and Risk Mitigation Measures (the "Report" or "Risk Assessment") details the risk assessment conducted by Apple pursuant to Article 34, which includes consideration of existing controls that are already in place to keep the App Store a safe and trusted place for users, as well as any specific mitigation measures identified pursuant to Article 35 to address any Systemic Risks. This Report reflects the position as at the date of finalisation of the Report, 28 August 2023.

1.5.2 A schedule detailing teams and functions referred to in the Report is provided at Schedule A.

1.5.3 Pursuant to Article 37 of the DSA, amongst other DSA obligations, Apple's compliance with Articles 34 and 35 will be subject to independent audit. Apple's subsequent risk assessments, including for year 2024 onwards, will factor in any feedback from its auditors, as well as any feedback from or guidance published by the Commission.

SECTION 2: RISK ASSESSMENT AND RISK MITIGATION METHODOLOGY

2.1 Section overview

2.1.1 This Section of the Report details the steps taken by Apple to comply with its obligations under Articles 34 and 35 of the DSA.

2.2 Risk Assessment coordination and key responsibilities

2.2.1 This Risk Assessment was coordinated by the App Store Legal team, a dedicated team of in-house counsel who have primary responsibility for all legal and regulatory issues relevant to the App Store, in collaboration with Apple Privacy Compliance Legal, EU Regulatory Legal, Services Special Programs and other relevant teams and Apple stakeholders.
2.2.2 EU external counsel at Gibson, Dunn & Crutcher LLP was engaged for the purposes of assisting the App Store Legal team in connection with Apple's conduct of the Risk Assessment, including its consideration of the reasonableness, proportionality and effectiveness of the existing App Store mitigation measures which are relevant to the Systemic Risks.

2.2.3 The Risk Assessment was conducted in parallel with Apple's work to implement new processes and controls to fulfil Apple's obligations under the DSA, including the establishment of a DSA compliance function, reporting to the ADI Board. Where relevant, new controls and processes are factored into Apple's assessment of whether it has in place reasonable, proportionate and effective mitigation measures to address any Systemic Risks stemming from the design, functionality or use of the App Store.

2.3 Identification of key relevant stakeholders and controls mapping

2.3.1 In 2022, having identified the App Store as a service likely to be designated as a VLOP pursuant to Article 33(1) and (4) of the DSA, Apple commenced a review of the relevant existing control framework and the extent to which those controls address potential Systemic Risks.

2.3.2 Apple identified key relevant stakeholders that would need to be consulted at the outset of this process, in order to map the relevant processes and workflow carried out by each team, including at different stages of an app's lifecycle. This scoping assessment also considered applicable terms and conditions, enforcement of the App Store Review Guidelines (the "Guidelines"), escalation intake and triage mechanisms, moderation of App Store-hosted user-generated content ("UGC"), and other controls, policies, and procedures relevant to the App Store.

2.3.3 The App Store Legal team collaborates with the impacted teams on a routine basis in relation to the management and mitigation of risk within the App Store, and reviews, authors, and updates key App Store policies, including the Guidelines, the Apple Developer Program License Agreement ("DPLA") and related Schedules, as well as App Store-related provisions of the Apple Media Services Terms and Conditions ("AMS Terms"). Based on these prior engagements and initial scoping activities, the App Store Legal team identified relevant teams and senior employees, including those responsible for App Review, Recommender Systems, Global Security Investigations, Privacy Legal and Privacy Compliance, Human Rights, and Trust and Safety Operations, to be consulted in the preparation of this Risk Assessment.

2.4 Scoping discussions

2.4.1 Apple conducted a series of scoping discussions with key stakeholders, in order to better understand the key relevant App Store operational processes and procedures and related controls.
2.4.2 The issues addressed in those meetings extended to:

(a) the role of each team in mitigating potential risks relating to the App Store;

(b) the functioning and operation of each team and the ways in which they interact with and rely upon the work of other teams within the App Store;

(c) key stages in the app lifecycle at which Systemic Risks may require mitigation;

(d) the risk mitigation measures in operation to keep the App Store a safe and trusted place for all users, including in relation to illegal content, disinformation and fraud;

(e) the extent to which the design, functionality or use of the App Store could give rise to Systemic Risks;

(f) the operation of any recommender systems, and the use of any algorithmic systems;

(g) the additional risk mitigation measures in operation to further enhance the protection of minors;

(h) internal and external escalation mechanisms and investigation procedures;

(i) the procedures in place within each team to monitor and analyse trends arising from the management and mitigation of risks;

(j) the frequency with which procedures or controls relating to each team are reviewed; and

(k) the effectiveness of relevant risk mitigation measures in addressing key areas of risk for the App Store.

2.5 Consideration of external commentary (government, NGO, trade bodies and interest groups, press, developer and consumer) on the extent to which the Systemic Risks stem from the design, functionality or use of the App Store

2.5.1 Senior personnel within each function in the App Store (and who were consulted in connection with this Risk Assessment) are highly attuned to current events and external commentary affecting the App Store and their functions in particular. They take account of such events and commentary in making ongoing improvements to the risk mitigation measures that they are responsible for. This includes commentary from government bodies, NGOs, relevant trade bodies and interest groups, as well as the press. They are also alive to, and responsible for considering, concerns raised by the extensive App Store developer community and its users. Such concerns and issues have been considered as part of this Risk Assessment.

2.5.2 Discussions were held with Apple functions that interact with external parties, including government agencies and human rights organisations, in order to understand any views raised about the App Store.

2.6 Assessment and identification of any Systemic Risks

2.6.1 Apple then assessed the extent to which any potential Systemic Risks stem from the design, functioning or use of the App Store, including by reference to the factors listed in Article 34(2) of the DSA.

2.7 Desktop review of documentation of relevant risk mitigation measures

2.7.1 Following the initial scoping discussions referred to above, and with the assistance of personnel within the App Review team, Recommender Systems, Trust and Safety Operations, and other functions, Apple gathered relevant documentation, building upon the understanding of the systems, controls, decision-making, and communication structures within the App Store critical to the management of risks, in particular those relating to the enforcement of the Guidelines through automated and human review of new app and app update submissions, internal escalation / external report intake procedures on app and App Store-hosted content, and related supervision, oversight and monitoring. The applicable provisions within the Guidelines, the DPLA and other pertinent documentation are addressed further below.
2.8 Consideration of data and documentation relevant to assessment of the effectiveness of existing risk mitigation measures

2.8.1 Documentation relevant to the assessment of the effectiveness of existing controls was gathered and reviewed.

2.9 Assessment of reasonableness, proportionality and effectiveness of existing controls in the light of the foregoing

2.9.1 As explained in further detail below, Apple already had in place extensive controls to keep the App Store a safe place for all users. Taking these into account, Apple conducted an assessment of whether those controls, as well as additional controls being implemented in connection with the DSA, constitute reasonable, proportionate and effective risk mitigation measures (factoring in the severity and probability of the Systemic Risks identified earlier in the risk assessment process).

2.10 Approach to preservation of documents

2.10.1 Pursuant to Article 34(3), VLOPs must preserve the supporting documents of this Risk Assessment for at least three years.

2.10.2 To comply with this obligation, Apple has retained all documentation obtained from various functions and subsequently reviewed as part of this risk assessment. Apple has also retained as documents "screen grabs" of any relevant information that is currently available online, which may be changed in the normal course of business. All such documentation will be preserved in accordance with Article 34(3) of the DSA.

SECTION 3: APPLE ECOSYSTEM FUNCTIONS, POLICIES AND PROTECTIONS

3.1 Section overview

3.1.1 This Section of the Report details certain relevant Apple-level (i.e. non-App Store specific) functions, policies and practices that apply to all of Apple's products and services across the wider Apple ecosystem.

3.1.2 These protections apply to the use of all Apple devices, regardless of whether a user engages with the App Store. While they do not form part of the design or function of the App Store itself, or of the provision of the App Store by Apple, they contribute to the overall risk environment in which the App Store operates. These protections extend to, but are not limited to, Apple's provision of the App Store.

3.2 Privacy and personal data

3.2.1 Apple recognises that privacy is a fundamental human right. It is also one of Apple's core values.

(a) Privacy by Design

3.2.2 Apple designs its products and services according to the principle of "privacy by design". Apple is widely recognised, including by industry and data protection experts, as setting the industry standard for minimising personal data collection. Apple builds privacy protections into everything it makes, including the devices and operating systems on which the App Store is designed to be used and its related processes, including the comprehensive App Review process, detailed further in Sections 4 and 6 below.

3.2.3 Apple deploys industry-leading user control mechanisms to allow its customers to choose whether to share data such as their Location, Contacts, Microphone, Camera, Health information, and more with apps. In addition, powerful security features help prevent anyone except the individual user from being able to access their own information.

3.2.4 When Apple does collect personal data, Apple retains it for only so long as necessary to fulfil the purposes for which it was collected, including as described in Apple's Privacy Policy [2] or in Apple's service-specific privacy notices, or as long as required by law.

[2] https://www.apple.com/legal/privacy/en-ww/
3.2.5 Privacy is a foundational part of the design process. Apple is constantly working on new ways to keep users' personal information safe and protect their privacy. Apple's Privacy Engineering team ensures that privacy protections are incorporated throughout Apple products, apps, and services. Apple's "Features" page provides an overview of privacy features embedded in its apps and services, including the App Store. [3]

[3] https://www.apple.com/privacy/features/

3.2.6 Apple believes that users can have great products and great privacy. Five principles are at the core of how Apple achieves this goal:

(i) Data minimisation

3.2.7 Apple's approach is to collect only the personal data required to deliver what users need. In instances where specific personal information is necessary, Apple minimises the amount of data that it uses to provide the intended service – such as a user's device location when searching in Maps. Apple does not maintain a comprehensive data profile of user activity across all its products and services to serve targeted advertising.

(ii) On-device intelligence

3.2.8 Apple uses machine learning to enhance user experience – and user privacy – by processing some user data on-device so that third parties, including Apple, do not see and have no access to user data. In those instances in which user data may be sent to Apple or third parties, Apple provides end users with choices, including transparent control mechanisms. Apple has used on-device processing for on-device image and scene recognition in Photos, predictive text in keyboards, and more. Developers can use Apple's frameworks, such as Create ML and Core ML, to create powerful new app experiences that do not require user data to leave the device. That means apps can analyse user sentiment, classify scenes, translate text, recognise handwriting, predict text, tag music, and more without putting privacy at risk.
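By way of illustration only, the following minimal sketch shows the kind of on-device text analysis described above, using Apple's NaturalLanguage framework: the sentiment of a string is scored entirely on the device, so the text itself never needs to be transmitted. The sample string and surrounding code are assumptions of this illustration, not drawn from this Report.

```swift
import NaturalLanguage

// Minimal sketch: on-device sentiment scoring with the NaturalLanguage
// framework. The analysis runs locally; the text never leaves the device.
let tagger = NLTagger(tagSchemes: [.sentimentScore])
let text = "I really love this keyboard's predictive text." // hypothetical input
tagger.string = text

let (tag, _) = tagger.tag(at: text.startIndex,
                          unit: .paragraph,
                          scheme: .sentimentScore)
if let score = tag.flatMap({ Double($0.rawValue) }) {
    // Scores range from -1.0 (most negative) to 1.0 (most positive).
    print("On-device sentiment score: \(score)")
}
```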
(iii) Transparency and control

3.2.9 When Apple does collect personal data, Apple is clear and transparent about it. Apple makes sure users know how their personal data is being used, and, if applicable, how to opt out at any time. Data and privacy information screens help Apple users understand how Apple will use personal data before users sign in or start using the service or any new features. To ensure that Apple is meeting its own high standards for protecting user data and privacy, Apple has conducted a comprehensive review of its services, products and features that collect and/or hold a user's data. This information is available in Apple's "Data & Privacy summaries", which are published on its website. [4]

[4] https://www.apple.com/legal/privacy/data/

3.2.10 Apple also provides a set of dedicated privacy management tools on Apple's "Data & Privacy" page. [5] This complete set of self-service tools includes options for users with an Apple ID [6] to: (i) get a copy of the data that they store with Apple that is associated with their Apple ID; (ii) transfer a copy of their data to another participating service; (iii) deactivate their Apple ID temporarily; (iv) delete their Apple ID – and the data associated with it – permanently; (v) request a correction to their personal data; and (vi) find out about the types of data that Apple collects.

[5] https://privacy.apple.com/

[6] An Apple ID is the account a customer uses to access all Apple services and to make their devices work together. When creating an Apple ID a user must provide their full name, date of birth, and an email address or phone number. Additional detail for users is available here: https://support.apple.com/apple-id

(iv) Protecting user identity

3.2.11 Apple has developed technologies to enable users to obscure their identity when data must be transferred to Apple servers. Sometimes Apple uses random identifiers so a user's data is not associated with their Apple ID. Apple has pioneered the use of "Differential Privacy" to understand patterns of behaviour while protecting an individual user's privacy. [7] By way of example, if a user chooses to send Apple analytics about their device usage, the collected information does not identify them personally. In such cases, to ensure that no personal data is being shared, Apple randomly generates device identifiers that cannot be traced back to a piece of hardware, a customer, or any identifier in any other data source at Apple. In addition, for particularly sensitive data, Apple applies further de-identification techniques on-device to further reduce any remaining risks of fingerprinting, by using techniques such as "Differential Privacy" or by omitting content. Techniques like these help Apple deliver and improve services while protecting users' privacy.

[7] https://www.apple.com/euro/privacy/e/generic/docs/Differential_Privacy_Overview.pdf
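To make the idea of differential privacy concrete, here is a worked sketch of "randomised response", one of the classic local differential-privacy mechanisms. It is a textbook illustration assumed for this rewrite, not Apple's actual implementation (which is described at the link in footnote 7).

```swift
import Foundation

// Randomised response: each device reports a possibly-flipped answer, so no
// individual report reveals the truth, yet the aggregate remains estimable.
func randomisedResponse(truth: Bool, epsilon: Double) -> Bool {
    // Probability of answering honestly: p = e^ε / (e^ε + 1).
    // A larger privacy budget ε gives more accuracy and less privacy.
    let p = exp(epsilon) / (exp(epsilon) + 1)
    return Double.random(in: 0..<1) < p ? truth : !truth
}

// The aggregator can de-bias the noisy reports without learning any
// individual's answer: observed = p·true + (1 − p)·(1 − true), hence:
func estimatedTrueRate(observedRate: Double, epsilon: Double) -> Double {
    let p = exp(epsilon) / (exp(epsilon) + 1)
    return (observedRate + p - 1) / (2 * p - 1)
}
```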
(v) Data security

3.2.12 Security is at the core of how Apple has designed its operating systems, products and services. Every Apple device combines hardware, software, and services designed to work together for maximum security and a transparent user experience. Custom hardware – such as the Secure Enclave, a dedicated secure subsystem in iPhone, iPad, and Mac, which is isolated from the main processor – powers critical security features like data encryption. [8] Software protections work to help keep the operating system and third-party apps safe. Services provide a mechanism for secure and timely software updates; power a safer app ecosystem, secure communications, and payments; and provide users a safer experience on the web. Apple devices help protect not only the device and the data stored therein, but the entire ecosystem, including what users do locally, on networks, and with key web services. Apple devices also have encryption features to safeguard user data and enable a remote wipe in the case of device theft or loss.

[8] https://support.apple.com/en-gb/guide/security/secf020d1074/1/web/1

(b) Privacy Governance

3.2.13 Apple takes a cross-functional approach to privacy governance. Privacy governance covers all areas of the company and covers both customer and employee data. The Vice President in charge of Privacy and Law Enforcement Compliance reports directly to Apple's General Counsel. Apple also has a dedicated Privacy Engineering team that partners with the Privacy Legal team and dedicated product counsel to design products from the ground up in a way that protects customer privacy, and to ensure that Apple protects any data that is within Apple's control. This includes strong processes involving Apple's Data Protection Officer, notably around ensuring that data is collected lawfully and is used only for the intended lawful purposes.

3.2.14 Apple also has a Privacy Steering Committee chaired by Apple's General Counsel, with members including Apple's Senior Vice President of Machine Learning and AI Strategy and a cross-functional group of senior representatives. [9] The Privacy Steering Committee sets privacy standards for teams across Apple and acts as an escalation point for addressing privacy compliance issues for decision or further escalation.

[9] The Privacy Steering Committee consists of Apple's General Counsel as well as senior representatives from Internet Software and Services, Software Engineering, Product Marketing, Corporate Communications, Information Services & Technology, Information Security, Privacy Legal and the Head of Business Assurance.

3.2.15 Apple regularly engages with a wide range of civil society representatives globally on various privacy and freedom of expression issues, including privacy by design and encryption.

3.2.16 Apple maintains current ISO 27001 and 27018 certifications. ISO 27001 is an international standard for implementing, managing and maintaining information security within a company. ISO 27018 is an international standard for the protection of personally identifiable information in public clouds. To maintain these certifications, Apple is subject to annual audits.

3.2.17 Apple's "Privacy Governance" page provides more details on Apple's approach to privacy governance, including the oversight and monitoring of privacy and data security, the privacy training that all Apple employees are required to take, Apple's data security and incident response, and how Apple handles privacy complaints and private requests for user information. [10]

[10] https://www.apple.com/legal/privacy/en-ww/governance/

(c) Privacy Impact Assessments

3.2.18 As part of Apple's commitment to privacy and other human rights, trained reviewers undertake Privacy Impact Assessments ("PIAs") for Apple's major products and services. PIAs are conducted when Apple is developing new products, services or features. Teams responsible for the development of such products or services must describe in detail how personal data will be processed, and the purposes, retention periods and other processing details. Reviews include assessments of how a product or service processes personal data, the necessity and proportionality of such processing, the risks or impact that any such processing has on individuals and their rights, and the mitigating controls implemented to address such risks or impact. PIAs are approved by Apple's Data Protection Officer.

3.3 Advertising and privacy

3.3.1 Apple's Advertising & Privacy service-specific privacy notice describes Apple's data-related practices in relation to its advertising activities and app promotion options, including Apple Search Ads on the App Store (described in Section 4 below).

3.3.2 The notice sets out how Apple's advertising platform is designed to protect users' privacy and give them control over how Apple uses their information. The Policy states at the outset that "[Apple's] advertising platform doesn't share personal data with third parties." Apple's advertising platform does not track any user, meaning that it does not link user or device data collected by Apple with user or device data collected from third parties for targeted advertising or advertising measurement purposes, and does not share user or device data with data brokers.

3.3.3 The notice provides detailed information about the minimal data that could be used to personalise Apple Search Ads to users, as well as information about how they can turn ad Personalisation on or off. Further information on ad Personalisation is set out in Section 6 below.
3.4 Protections relating to children

(a) Child safety

3.4.1 Apple knows that keeping children safe online is imperative and for that reason has created a number of features to help protect children and provide information to parents and guardians to improve children's safety online. These include:

(a) Child Account Set-Up;

(b) Family Sharing;

(c) Screen Time; and

(d) Ask to Buy.

3.4.2 Child Account Set-Up. "Family Sharing" is an operating system-level feature that is accessible in the Apple ID section of settings. Using Family Sharing, a family organiser can invite up to five other family members to join the family group and set up accounts for users under 13 (or the relevant age in their country or territory of residence). [11] When setting up an account for a child under 13 (or the relevant age in their country or territory of residence), parents and guardians can choose to enable a range of parental controls to manage their child's experience. Child users cannot create an Apple ID themselves if they indicate that they are under 13 years of age (or the relevant age in their country or territory of residence); all such accounts must be set up by parents via Family Sharing.

[11] For residents in the EU, the relevant age is 13 (or the minimum age of lawful consent in the relevant jurisdiction in application of Article 8 of the General Data Protection Regulation ("GDPR")).

3.4.3 Family Sharing enables the safe use of Apple devices and products by families and children and allows parents to share access to Apple services. However, there may be times when parents want to limit the child's access to certain types of content or purchases available to the rest of the Family. As noted above, if a user is below the relevant age then a parent must create the Apple ID for the child.

3.4.4 Screen Time provides parents and children an insight into the time the child is spending using apps, visiting websites, and on the device overall, and provides weekly reports to help monitor device use. Parents can use Screen Time to better understand and make choices about how much time their children spend using apps and websites. Activity Reports give parents a detailed overview of their child's app usage, notifications, and device pickups – and only they, their children, and those they choose to share it with can view this information. Parents can choose to apply content restrictions, which restrict download of, for example, apps or games with specific age ratings or categories of apps or games. They can also fully restrict the downloading of some or all apps via Screen Time settings.

3.4.5 Further, through Screen Time, parents can set individual parental controls to restrict their children's Apple devices to limit the websites they visit, the types of movies and TV shows they watch, their access to FaceTime and Camera, and the types of music and podcasts they can access, to prevent them encountering explicit content. All this can be password protected with a parental code. Parents can also place restrictions on privacy settings, such as for Location Services and Photos, so that their children cannot change those settings themselves. Apple enables parents to make exceptions for specific apps, like educational or mindfulness apps, and even allows parents to set specific times during the day when apps, notifications and certain features are automatically blocked. Parents can also select which apps appear on their child's device "home" screen.
3.4.6 Communication Limits allows parents to choose who their children are communicating with and when throughout the day, including during downtime, so children can always be reachable, whilst providing the knowledge and control to help keep them safe.

3.4.7 Ask to Buy allows parents to approve app downloads and purchases requested by the child, including in-app purchases, on the App Store or otherwise using iTunes. It is enabled by default for any children under 13 and can be enabled for any family member under 18 by the Family Organiser. [12]

[12] https://support.apple.com/en-us/HT201089

3.4.8 If a child initiates a download or purchase on their device, parents receive a request to approve it on their own device. If they choose to approve it, the App Store will complete the download or purchase on the child's device. If they decline, the process stops there (i.e. the App Store will not complete the download or purchase).

(b) Child protection

3.4.9 Communication Safety is a parental control feature which provides warnings in the event that a child receives or sends images containing nudity on iMessage. From the next OS update (e.g. iOS 17), this feature will be enabled by default for all users under 13 years of age or the equivalent minimum age in their country or territory of residence. Parents can also enable the feature via Family Sharing for children under 18 years of age. By using an on-device image classifier, the image is detected and blurred and the child receives an alert along with helpful and age-appropriate resources and the option to send a message to a trusted person for help. This feature is to be expanded to AirDrop and FaceTime video messages, as well as PhotosPicker. It will also be made available to developers for use on third-party messaging and communication apps by implementing the Sensitive Content Analysis framework (starting with iOS 17). [13] End-to-end encryption is maintained and no one, including Apple, has access to the messages.

[13] https://developer.apple.com/documentation/sensitivecontentanalysis
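As a purely illustrative sketch of how a third-party app might adopt the Sensitive Content Analysis framework referred to above (assuming iOS 17 and the corresponding entitlement), the on-device check might look like the following; the function name and fallback behaviour are assumptions of this illustration.

```swift
import SensitiveContentAnalysis

// Illustrative sketch (iOS 17+): an app with the Sensitive Content Analysis
// entitlement checks a received image on-device before displaying it.
// Nothing is transmitted off the device by this check.
func shouldBlur(imageAt imageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // Analysis is only active when the user (or a parent, via Family
    // Sharing) has enabled the feature; otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: imageURL)
        return analysis.isSensitive
    } catch {
        // The app chooses its own fallback if analysis fails.
        return false
    }
}
```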
3.4.10 Expanded guidance in Siri and Safari search provides additional online safety information and local resources, which includes information on how to report Child Sex Abuse Material ("CSAM") or Child Sexual Exploitation and Abuse ("CSEA") and how to seek support and advice for situations which may arise online and offline (helplines and hotlines in each jurisdiction). Siri and search will also intervene in the event that users perform searches for CSAM, explaining the dangerous and illegal nature of it and providing resources and links to partners who can provide help to prevent abuse.

3.4.11 Within Apple's Global Security function, Apple employs dedicated Child Safety Counsel. Child Safety Counsel works with other areas of the Apple business (including those specific to the App Store) relevant to child safety and contributes to policies and procedures to keep children safe when they engage with Apple products and services. Child Safety Counsel is also responsible for investigating escalations from within Apple and third parties (including developers and users) relating to CSAM or CSEA material, and, where necessary, reporting issues to law enforcement agencies.

(c) Children and data

3.4.12 Apple understands the importance of safeguarding the personal data of children. That is why Apple has implemented additional processes and protections for children. If Apple learns that a child's personal data was collected without appropriate authorisation, it is deleted as soon as possible.

3.4.13 Apple maintains robust privacy protections as a basic requirement for all of its users, including children, ensuring the provision of strong safeguards to all children regardless of their age range or developmental stage. These high standards include data minimisation, on-device processing, transparency measures, and data security tools.

3.4.14 Additional App Store-specific controls relevant to children are addressed in Section 6 of this Report.

3.5 Human rights

3.5.1 Apple is committed to respecting human rights, including the right to privacy and freedom of information and expression. Human rights are at the core of how Apple treats everyone – from its customers and teams to its business partners and people at every level of its supply chain. Apple reflected this commitment in its Human Rights Policy, first published in 2020, [14] which states that Apple's approach to human rights issues is based on the UN Guiding Principles on Business and Human Rights. The Policy was adopted by Apple Inc.'s Board (the "Apple Inc. Board"), which is responsible for overseeing and periodically reviewing it. Apple's Senior Vice President and General Counsel oversees the implementation of this Policy and reports to the Apple Inc. Board and its committees on progress and significant issues.

[14] https://s2.q4cdn.com/470004039/files/doc_downloads/gov_docs/2020/Apple-Human-Rights-Policy.pdf

3.5.2 The Human Rights Policy touches on a number of issues that are relevant to Apple, including human rights considerations in the design and functioning of its products and human rights risks in its supply chains, and explains that, in keeping with the UN Guiding Principles, where national law and international human rights standards differ, Apple follows the higher standard. Where they are in conflict, Apple respects national law while seeking to respect the principles of internationally recognised human rights.

3.5.3 Apple has a dedicated Human Rights function that is responsible for conducting human rights due diligence across Apple, in order to identify human rights risks arising in connection with Apple's business operations and to implement plans to prevent or mitigate such risks. It also works with different business groups to align existing processes with the Human Rights Policy framework. In addition, the team issues human rights-related training content, which is delivered to Apple employees around the globe.

3.5.4 Apple identifies salient human rights risks through internal risk assessments. In some cases, it identifies issues via external industry-level third-party audits, as well as through the channels it maintains with rights holders and other stakeholders, including investors, human rights and labour experts, governments, and international bodies such as the UN. In addition to its own internal monitoring, Apple considers reports identifying potential risks from external sources, including international organisations, policy makers, shareholders, civil society organisations, news outlets, customers, individuals in the supply chain or supply chain communities, whistleblower mechanisms, and third-party hotlines.

3.5.5 Based on this type of due diligence, by way of example, in 2022 Apple identified the following human rights issues of particular focus (detailed in its 2022 Environmental Social Governance ("ESG") Report): [15]

(a) Privacy, freedom of expression and access to information risks;

(b) Discrimination risks in workforce management and in product services and development; and

(c) Labour and human rights risks in the supply chain.

[15] https://s2.q4cdn.com/470004039/files/doc_downloads/2022/08/2022_Apple_ESG_Report.pdf
3.5.6 More detail on Apple's ongoing human rights efforts is provided in the 2022 ESG Report.

3.6 Apple fraud prevention

3.6.1 Apple employs industry best practices to safeguard Apple customers and prevent potentially fraudulent transactions across Apple Media Products platforms, including for example the App Store, Apple Music, and iCloud services.

3.6.2 Apple's fraud mitigation tools include, but are not limited to, Two-Factor Authentication, Fraud Screening, Hostile Fraud Screening, First Party Misuse Screening, and Account Takeover Detection.

3.6.3 Apple has also developed an internal set of proprietary risk tools allowing Apple to review data to comprehensively understand the effects of its fraud detection efforts and propose new approaches to fraud attempts. These tools include monitoring mechanisms that utilise AI/ML techniques which aid Apple in being flexible and adaptable in its current and future fraud detection efforts. Risk decision tools are evaluated for their impacts on fraud reduction and adjusted periodically to ensure Apple is making the most of its available tools and detection methods.

3.6.4 In 2020, Apple's combination of technology and human expertise protected customers from more than $1.5 billion in potentially fraudulent transactions. In 2021, Apple protected customers from nearly $1.5 billion in potentially fraudulent transactions, and stopped more than 1.6 million risky and vulnerable apps and app updates from defrauding users. [16] In 2022, Apple blocked nearly 3.9 million stolen credit cards from being used to make fraudulent purchases, and banned 714,000 accounts from transacting again. In total, in 2022, Apple blocked $2.09 billion in fraudulent transactions on the App Store. [17]

[16] https://www.apple.com/newsroom/2022/06/app-store-stopped-nearly-one-point-five-billion-in-fraudulent-transactions-in-2021/

[17] https://www.apple.com/newsroom/2023/05/app-store-stopped-more-than-2-billion-in-fraudulent-transactions-in-2022/

SECTION 4: THE RISK PROFILE OF THE APP STORE

4.1 Section overview

4.1.1 The DSA identifies, in broad terms, categories of potential Systemic Risk and factors for VLOPs to consider in assessing such risks. The DSA also recognises that each VLOP's risk assessment should be tailored to the unique "design or functioning of their service and related systems" and "shall be specific to their services and proportionate to the systemic risks" of that service. [18] This recognises that each VLOP will have its own distinct risk profile and assessment based on the design, functioning and use of its service.

[18] Digital Services Act, Article 34(1).

4.1.2 The App Store [19] provides users of Apple devices with the means to discover and download apps from the App Store. From its inception, the App Store was designed to protect users of Apple devices by creating a safe and trusted environment offering a wide variety of curated apps. Every app and every app update submitted to the App Store is closely reviewed by both automated systems and human experts trained to ensure that apps offered on the App Store are safe, provide a good user experience, protect user privacy, and use approved business models. Post publication, apps are subject to ongoing monitoring, and multiple controls ensure that Apple can take action when it is alerted to problematic developers or apps.
However, Apple cannot monitor all activity that happens within apps, given Apple's privacy by design principles, which mean that apps run on the device so as to minimise the data shared with Apple.

[19] As noted at footnote 1 above, ADI does not accept that iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store and tvOS App Store all form part of a single online platform. Rather, they are separate online platforms with significant differences from both a developer and end user perspective. Notwithstanding this, several of the key compliance controls forming part of Apple's risk mitigation measures under Article 35 of the DSA apply to each of the current App Stores. As such, in this risk assessment, save as indicated otherwise, or where obvious from context, use of the expression "App Store" should be understood as extending to each of iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store and tvOS App Store.

4.1.3 This Risk Assessment addresses the potential systemic risks of the App Store that exist within the framework of the lifecycle of an app distributed in the App Store. Risks that arise outside of the App Store are beyond the scope of Article 34. As such, this Section provides an overview of the lifecycle of an app in the App Store – including app discovery, where users learn about and download apps. This Section also summarises the stages before app discovery: developer onboarding; app review; and recommender, advertising, and moderation systems that impact the presentation of apps and reviews to customers. Finally, this Section addresses the App Store's notice and action mechanisms, which help to mitigate potential App Store risks, as well as external risks that are the responsibility of developers.

4.1.4 Note that this Section describes how users discover apps in the App Store service, the process by which apps are published on the App Store, and notice and action measures, to guide the reader when considering the Systemic Risk assessment in Section 5. Detailed information regarding the risk mitigation controls mentioned in this Section 4 and their role in mitigating the Systemic Risks is provided in Section 6.

4.2 The App Store provides app discovery and distribution

4.2.1 Developers appoint ADI as their commissionaire for the marketing and delivery of apps to end users in the EU. Those end users are users of Apple devices who discover and download apps in the App Store, through one of the five landing pages (tabs) – "Today", "Games", "Apps", "Arcade", and "Search" – or by visiting the product page of an app. [20]

[20] There is some variation between the tabs available on each App Store. The five tabs listed in this paragraph appear on the iOS and iPadOS App Stores.

4.2.2 Below is an overview of how App Store discovery works from the end user's perspective, and where they encounter content in the App Store that could in principle engage the Systemic Risks.

4.2.3 The App Store operates 175 country- or region-specific "storefronts", and users transact through a storefront based on their home country. Each EU Member State has a separate storefront. [21] The App Store is available in 40 languages, including 17 official languages of the EU. [22] Information presented in the App Store is therefore "localised", such that app metadata [23] is displayed in different languages, depending on a user's location and language settings. Editorially curated content (described below) may vary, depending on a user's location.

[21] For App Store availability in EU storefronts, see https://support.apple.com/en-us/HT204411

[22] https://developer.apple.com/localization/

[23] In this Report, app metadata comprises text (such as title, descriptions and keywords) and visuals (such as icon, screenshots and video) that are shown in the App Store.

(a) The "Today" tab

4.2.4 The Today tab is the first page a user sees when they click on the App Store icon on their device.
Apple considers this a "daily destination" with original stories from App Store editors, featuring exclusive premieres, new app releases, Apple's all-time favourite apps, an "App of the Day", a "Game of the Day", and more. It offers tips and how-to guides to help customers use apps in innovative ways, and showcases interviews with inspiring developers. Stories are selected based on curation by the App Store Editorial team, and they share Apple's perspective on apps and games and how they impact users' lives, using artwork, videos, and developer quotes to bring apps to life.

4.2.5 App Store editors create a curated catalogue of apps for each category in the Today tab (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favourites, Get Started). For each curated category, the Editorial team determines whether to "pin" certain categories in designated vertical positions on the Today tab landing page.

4.2.6 The Today tab also features "Top" charts, such as Top Free Games and Top Paid Games with various categories (AR Games, Indie Games, Action Games, Puzzle Games, Racing Games, Simulation Games); Top Free Apps and Top Paid Apps with various categories (Apple Watch Apps, Entertainment, Health & Fitness, Kids, Photo & Video, Productivity); Top Podcasting Apps; and Top Arcade Games. Apps are selected for charts based on the most downloads in the App Store within approximately the past 24-hour period.

4.2.7 App Store editors can also choose to have categories personalised for the user based on prior engagement (for example, purchase or download) behaviour in the App Store. If a story has been personalised, the Today tab would surface and order stories that are most relevant based on a user's purchase and download history. For example, personalised stories related to games may be surfaced as relevant to users who recently downloaded apps in the games category.

(b) The "Games" and "Apps" tabs

4.2.8 The Games and Apps tabs on the App Store provide dedicated experiences for games and apps that inform and engage customers through recommendations on new releases and updates, videos, top charts, and handpicked collections and categories. For these tabs, all apps are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

4.2.9 When considering apps to feature in these tabs, App Store editors look for high-quality apps across all categories, with a particular focus on new apps and apps with significant updates.

(c) The "Arcade" tab

4.2.10 The Arcade tab in the App Store features games which are made available as part of Apple's subscription service "Apple Arcade".

(d) The "Search" tab

4.2.11 The App Store Search tab provides an additional way for customers to find apps, games, stories, categories, in-app purchases, and developers.
Before a user enters a search, the Search tab shows popular or trending queries in the "Discover" section, as well as a list of apps that a user may want to search for in the "Suggested" section. These apps are selected based on aggregate search behaviour and information curated by Apple's editors. In some cases, suggested queries may be personalised for users in the "Discover" section and apps may be personalised for users in the "Suggested" section, based on prior engagement in the App Store. In sum, the apps shown in Search before a search term is entered are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

4.2.12 Searches use metadata from developers' product pages to deliver the most relevant results. The main parameters used for app ranking and discoverability are the relevance of text / titles, keywords, and descriptive categories provided in the app metadata; and user engagement in the App Store, such as the number and quality of ratings and reviews and application downloads. Date of launch in the App Store may also be considered for relevant searches.
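Purely for illustration, the toy function below shows how ranking inputs of the kind listed in paragraph 4.2.12 (metadata relevance, ratings and reviews, downloads, and launch date) might be combined into a single score. The weights, formula, and type names are invented for this sketch; the Report does not disclose Apple's actual search algorithms.

```swift
import Foundation

// Hypothetical ranking inputs mirroring the parameters named in 4.2.12.
struct SearchCandidate {
    let textRelevance: Double   // 0...1: match of title/keywords/category to the query
    let averageRating: Double   // 0...5 stars
    let ratingCount: Int
    let downloads: Int
    let daysSinceLaunch: Int
}

// Invented scoring function: engagement and popularity are log-damped so
// very large apps do not dominate, and recently launched apps get a mild boost.
func illustrativeScore(_ c: SearchCandidate) -> Double {
    let engagement = (c.averageRating / 5.0) * log(1.0 + Double(c.ratingCount))
    let popularity = log(1.0 + Double(c.downloads))
    let recencyBoost = c.daysSinceLaunch < 30 ? 1.1 : 1.0
    return recencyBoost * (3.0 * c.textRelevance + engagement + 0.5 * popularity)
}
```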
(e) App product page

4.2.13 When a user taps on an app during discovery, they are taken to the app product page, which provides information about the app.

4.2.14 Most of the information on the app product page is input by the developer, such as developer and app information; app icons, screenshots, and previews; a privacy policy URL; support links; an age rating; and data handling practices. The App Store also provides customer rating and review information on the app product page. This is the only UGC on the App Store. If the user has downloaded the app, they see a link to the Report a Problem feature, which lets customers request a refund, report a quality issue, or report a scam or fraud, or offensive, illegal or abusive content.

(f) Apple’s paid app placement option on the App Store (Apple Search Ads)

4.2.15 Developers may also engage in paid promotion of their apps in the App Store through Apple Search Ads, which provides a means for third-party developers to increase the visibility of their apps that are already distributed on the App Store. Through Apple Search Ads, apps may be displayed in the Today tab; the Search tab and Search Results; and in the product page while browsing.

4.2.16 Apple Search Ads placements are clearly distinguished from organic App Store placements and search results with a prominent “Ad” mark (language localised), and may include border and background shading demarcations. Tapping on the “Ad” mark displays an “About this Ad” sheet, which provides information about why the user has been shown that particular Apple Search Ad and what criteria, if any, were used to display the app campaign.

4.2.17 Apple Search Ads is a purely optional service for developers, accessible through an independent account (an Apple Search Ads account), using a different web portal from App Store Connect.24 Apple Search Ads were made available to users in certain EU storefronts five years ago; more were added thereafter.25 Today, Apple Search Ads are available to users in most EU storefronts,26 though only a small percentage of App Store developers choose to promote their apps using Apple Search Ads. If developers choose not to use the Apple Search Ads service to promote their app, their app will still appear across the various available organic placements of the App Store, including within search results, just as it would if the developer had chosen to use Apple Search Ads to secure promoted placements. The two services and placement algorithms work separately from each other.

4.3 App Store processes and functions help to provide a safe and trusted place for customers to discover and download apps

4.3.1 The content below provides a summary of the App Store process from a developer perspective.

24 App Store Connect is a developer tool where developers upload, submit, and manage their apps.
25 https://searchads.apple.com/countries-and-regions
26 Apple Search Ads is not available to users on the Bulgaria, Estonia, Latvia, Lithuania, Luxembourg, Malta, Slovakia, or Slovenia storefronts.

(a) Developers are screened and must agree to terms and conditions

4.3.2 Before an app can be published in the App Store, a developer must register to enrol as an Apple Developer. A developer must sign in with an Apple ID with two-factor authentication, review and accept the latest terms of the Apple Developer Agreement,27 and enter identity information. If the developer is enrolling via the Apple Developer app, they are asked to verify their identity with a driver’s licence or government-issued photo ID.

4.3.3 The World Wide Developer Relations team conducts a screening intended to prevent fraudulent developers from enrolling, including verifying developer identity, enrolment country, and financial information, as well as automated checks against existing and terminated developer accounts to ensure that bad actors (that is to say, developers who have previously committed or appear to intend to commit serious breaches of the Apple Developer Agreement (the “ADA”), DPLA or App Review Guidelines) and associates do not re-enter the program. In addition, the global export sanctions compliance team also conducts a sanctions check against the developer information to ensure Apple is not prohibited from doing business with the developer.

4.3.4 If a developer passes this round of screening, they can then execute the DPLA,28 and begin the multi-step process of submitting an app for distribution on the App Store.

(b) Automated and human-based app review

4.3.5 The App Review process applies to both new apps and to updates to existing apps (for example, when an app introduces a new version, adds new features, extends to new platforms, or uses an additional Apple technology).

4.3.6 Every app or app update provided to the App Store for distribution is uploaded through App Store Connect, which is a developer tool where developers upload, submit, and manage their apps. Upon submission, the developer creates an app record and provides app metadata, along with the app name and description and other relevant information.29 Every app or app update submission is then reviewed by the App Review team, first via automated means and then by human app reviewers.

4.3.7 The App Review automated process includes static binary analysis, asset analysis, and runtime analysis [CONFIDENTIAL]. The aim of these automated processes is to efficiently gather information that can be interpreted by machine learning algorithms and analysed for threats and signals (for example, the presence of malicious URLs or executable code) that provide relevant app information to the human review component.
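The automated stage described in paragraph 4.3.7 can be pictured as a pipeline of analysis stages whose output signals are handed to human reviewers. The following Swift sketch is hypothetical: the stage names follow the paragraph above, but the types, protocol and aggregation logic are assumptions, not a description of Apple’s tooling.

```swift
import Foundation

// Illustrative sketch only: the stage names follow 4.3.7, but the types and
// the signal model are assumptions, not a description of Apple's tooling.
struct ReviewSignal {
    let source: String   // e.g. "static-binary", "asset", "runtime"
    let detail: String
    let severity: Double // 0.0 (informational) ... 1.0 (likely threat)
}

protocol AnalysisStage {
    func analyse(appBundle: URL) -> [ReviewSignal]
}

struct StaticBinaryAnalysis: AnalysisStage {
    func analyse(appBundle: URL) -> [ReviewSignal] {
        // e.g. scan the binary for malicious URLs or hidden executable code
        []
    }
}

struct AssetAnalysis: AnalysisStage {
    func analyse(appBundle: URL) -> [ReviewSignal] {
        // e.g. inspect bundled assets for undisclosed or harmful content
        []
    }
}

struct RuntimeAnalysis: AnalysisStage {
    func analyse(appBundle: URL) -> [ReviewSignal] {
        // e.g. observe the app's behaviour while it executes in a sandbox
        []
    }
}

// Automated stages run first; their aggregated signals are then handed to
// human reviewers (cf. 4.3.8), ordered so the most significant appear first.
func automatedReview(of appBundle: URL, stages: [AnalysisStage]) -> [ReviewSignal] {
    stages.flatMap { $0.analyse(appBundle: appBundle) }
          .sorted { $0.severity > $1.severity }
}
```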
4.3.8 During human review, app reviewers analyse the signals provided by automated systems and review the features and functionality of apps to ensure they are compatible with the App Store’s systems and products, comply with the Guidelines, and do not give signs of potential deceptive, abusive, or otherwise harmful behaviour. If a reviewer detects a potential Guideline violation, they engage with the developer, reject the app, or further escalate issues to specialists within the App Review team or to other functional groups, such as the App Store Legal team. If there are no Guideline violations, the app may be approved for publication in the App Store.

27 https://developer.apple.com/support/downloads/terms/apple-developer-agreement/Apple-Developer-Agreement-20230605-English.pdf
28 https://developer.apple.com/programs/apple-developer-program-license-agreement/
29 https://developer.apple.com/help/app-store-connect/create-an-app-record/add-a-new-app

(c) Post-publication review

4.3.9 The App Review process continues even after an app is first published on the App Store. Developers are required to submit updates to their apps to the App Review team. This ensures that the App Store reviews apps throughout their entire lifecycle, and can identify new features and functionality that may not comply with the Guidelines. Furthermore, the App Store takes action against apps that exhibit malicious or other problematic behaviours after they have become available in the App Store. The App Store has a number of automated tools in place to detect malware on existing apps, which it runs at periodic intervals to capture content at different times. This includes tools to identify “bait-and-switch” apps, where apps available on the App Store change or add new functionality after approval by the App Review team. Once flagged by automation, these apps are re-reviewed by human app reviewers to evaluate whether intervention is needed.

(d) [CONFIDENTIAL]

4.3.10 [CONFIDENTIAL].

(e) Reviews of user-generated ratings and reviews of apps

4.3.11 The only UGC on the App Store is user-generated app ratings and reviews, both of which are subject to content moderation by the Trust and Safety Operations team. The Trust and Safety Operations team takes both preventative and responsive steps to ensure that risks arising from ratings and reviews are minimised. These risks may include inauthentic or misleading ratings and reviews, including by users who have not used the app.

4.3.12 When the App Store is alerted to a concern about a rating or review, it investigates and may remove a review or developer response, and / or disable the ability to review from a user account. In certain cases, ratings and reviews are escalated for further investigation, for example where a reported concern involves malicious activity that implies bodily harm, or child safety and / or child exploitation concerns. Reviews that contain information concerning a criminal offence involving a threat to life or safety will also be escalated and, if necessary, reported to law enforcement, in accordance with Article 18 of the DSA.
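The escalation logic described in paragraph 4.3.12 can be summarised, purely for illustration, as a routing function over concern categories. The category and outcome names in the Swift sketch below are assumptions made for this sketch.

```swift
import Foundation

// Hypothetical sketch of the escalation logic in 4.3.12; the category names
// and routing outcomes are assumptions made for illustration.
enum ReviewConcern {
    case inauthenticOrMisleading
    case childSafetyOrExploitation
    case threatToLifeOrSafety
    case other
}

enum ModerationOutcome {
    case standardReviewQueue             // investigate; possibly remove review
    case escalateForFurtherInvestigation
    case escalateAndNotifyLawEnforcement // cf. Article 18 of the DSA
}

func route(_ concern: ReviewConcern) -> ModerationOutcome {
    switch concern {
    case .threatToLifeOrSafety:
        return .escalateAndNotifyLawEnforcement
    case .childSafetyOrExploitation:
        return .escalateForFurtherInvestigation
    case .inauthenticOrMisleading, .other:
        return .standardReviewQueue
    }
}
```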
(f) For apps live on the store, the App Store provides avenues for consumers, developers, government authorities and others to provide notice of potential problems or concerns with apps or app content

4.3.13 Customers may use the “Report a Problem” feature to submit notices of offensive, illegal, or abusive content concerning apps they have purchased or downloaded. Report a Problem is accessible via quick links at the bottom of the Games and Apps tabs or from the product page of any app purchased or downloaded. These submissions are screened for manipulation and, if legitimate, forwarded to the appropriate team (for example, the App Review team, or Trust and Safety Operations) to investigate for signs of fraud, manipulation, abuse and other violations of the Guidelines and take action, if necessary. Such action may include working with developers to resolve issues, removing illegal or harmful apps, and / or terminating developer accounts. As detailed in Section 6, developers have recourse to various appeal mechanisms in the event that they disagree with Apple’s decision to remove apps or terminate developer accounts.

4.3.14 Developers and users also have the ability to report potential problems or concerns with app reviews or ratings by submitting notices using Apple’s “Report a Concern” function.30 This feature allows developers to submit a customer review removal request, and allows developers and users to report concerns with user ratings and reviews, including concerns regarding relevance, spam or fraud. As with customer Report a Problem notices, developer and user notices regarding ratings and reviews are forwarded to the appropriate internal teams for review, investigation, and potential action.

4.3.15 If a developer or user believes that an app violates their intellectual property rights, they can submit a claim to the AMS Content Disputes Legal team, using the App Store content disputes form.31 The team will put them in direct contact with the developer, as primary responsibility for settling content disputes rests with the parties. In some circumstances, the AMS Content Disputes Legal team will intervene and take action against developers and apps.

4.3.16 Government authorities from law enforcement and various regulatory agencies may send notices requesting information or app removals based on alleged or suspected violations of local law. Authorities send requests to the App Store to take down or investigate apps via email notice to dedicated email addresses, [CONFIDENTIAL] or, for law enforcement inquiries and notices, lawenforcement@apple.com. These requests are vetted by the App Store Legal team.

4.3.17 Where credible information is received from any source (for example, users, developers or law enforcement) that a developer is not acting in accordance with the Guidelines or local law, Apple will investigate and take appropriate action, which may include removal of the app from the App Store and removal of the developer from the Apple Developer Program.
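Taken together, paragraphs 4.3.13 to 4.3.16 describe a screening-and-routing flow for incoming notices. The Swift sketch below is a simplified, hypothetical rendering: the team names follow the Report, but the screening flag and the routing rules are assumptions.

```swift
import Foundation

// Illustrative sketch of the notice flow in 4.3.13 to 4.3.16; the team names
// follow the report, but the screening and routing logic are assumptions.
enum NoticeSource { case customer, developer, governmentAuthority }

struct Notice {
    let source: NoticeSource
    let claim: String
    let passedManipulationScreen: Bool // result of an upstream screening step
}

enum HandlingTeam { case appReview, trustAndSafetyOperations, appStoreLegal, none }

func assign(_ notice: Notice) -> HandlingTeam {
    // Submissions that appear manipulated are screened out (cf. 4.3.13).
    guard notice.passedManipulationScreen else { return .none }
    switch notice.source {
    case .governmentAuthority:
        return .appStoreLegal            // authority requests are vetted by Legal
    case .developer:
        return .trustAndSafetyOperations // e.g. ratings and reviews concerns
    case .customer:
        return .appReview                // e.g. fraud, abuse, Guideline breaches
    }
}
```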
30 https://developer.apple.com/contact/#!/topic/select/SC1108/subtopic/select
31 https://www.apple.com/legal/internet-services/itunes/appstorenotices/#?lang=en

4.3.18 In addition, if Apple is alerted to information on the App Store that gives rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place, as envisaged in Article 18 of the DSA, steps will be taken to notify the appropriate law enforcement authorities.

(g) New DSA Notices and Actions Process

4.3.19 In August 2023, pursuant to its DSA obligations, Apple made updates to the Report a Problem process and introduced a new content reports portal.

4.3.20 Users on a storefront in the EU now have the option to select “Report offensive or abusive content” or “Report illegal content” from the options menu. If the user selects “Report offensive or abusive content”, the process remains as described in paragraph 4.3.13 above. If they select “Report illegal content”, they are redirected to a web portal at ContentReports.apple.com (the “Content Reports portal”). The Content Reports portal can also be accessed directly via the web.

4.3.21 The Content Reports portal is a central platform where individuals, including government representatives, and in due course “Trusted Flaggers”,32 can file notices concerning alleged illegal content, from which communications concerning those notices are processed and sent, and in which data is consolidated for later transparency reporting purposes. Anyone in the EU can submit concerns about alleged illegal content via the Content Reports portal, whether or not they have purchased or downloaded the app in question. Members of the public in the EU can also use the portal to anonymously file notices concerning CSAM content.

32 “Trusted Flaggers” are organisations designated under Article 19 of the DSA, which have particular expertise and competence for the purposes of detecting, identifying and notifying illegal content.

SECTION 5: POTENTIAL SYSTEMIC RISKS ARISING FROM THE DESIGN, FUNCTIONING OR USE OF THE APP STORE

5.1 Section overview

5.1.1 This Section contains an assessment of how the Systemic Risks in the EU may stem from the design, functionality or use of the App Store.33

5.1.2 Following careful analysis, Apple has not identified any meaningful basis to distinguish risks stemming from the design and function of the App Store from risks stemming from its use. The App Store controls environment has been developed over many years in a manner designed to address issues arising from the way in which the App Store is used by developers and end users. Against that background, and to avoid unnecessary and unhelpful artificiality and repetition, Apple has sought to identify risks as they may arise from the design and function of the App Store, taking into account its use.

5.1.3 While the concept of Systemic Risk is not comprehensively defined in the DSA, Apple has not identified any risks in the EU beyond or separate from those listed in Article 34(1) that might reasonably be said to stem from the design and function of the App Store, or its use, and that might reasonably be said to be systemic in nature. As such, this risk assessment addresses those Systemic Risks specifically identified in Article 34(1).
5.2 Article 34(2) first paragraph factors

5.2.1 Pursuant to Article 34(2) first paragraph, in conducting this risk assessment, Apple is required to take account of whether and how certain specified factors may influence any of the Systemic Risks. Each of the factors is considered in Section 6 of the Report, but Apple notes the following:

(a) Recommender systems and other algorithms

5.2.2 Recital 84 of the DSA states that “where the algorithmic amplification of information contributes to the Systemic Risks”, this should be reflected in VLOPs’ risk assessments.

5.2.3 As detailed in Section 6 (in particular, paragraphs 6.6.1 et seq. below), while Apple makes limited use of recommender and other algorithmic systems compared with other VLOPs, end users of the App Store do receive recommendations with respect to a selected and limited set of apps on the App Store that have already been approved through the App Review process. Furthermore, the App Store recommender function makes no use of profiling. There is also a limited search function on the App Store, which allows users to search for App Review approved apps and content, and which operates by algorithmic means. Some content placement can be “personalised”, but users are given the choice to disable personalised recommendations (except for children’s accounts, where recommendations cannot be personalised).

33 The assessment of risks in this Section is limited to those risks that may arise in the EU.

5.2.4 Additional controls detailed in Section 6 ensure that any impact of the App Store’s use of recommender systems or other algorithmic systems on the Systemic Risks involves ample and specific risk mitigation; in particular, Apple is confident that its current controls regarding the operation of its recommender systems are such that those systems do not lead to the amplification of information or disinformation that contributes to the Systemic Risks. As such, the impact of this factor on Systemic Risks is taken into account throughout this risk assessment.

(b) Content moderation systems

5.2.5 Prior to the passing of the DSA, there were already various content moderation systems on the App Store, including ongoing monitoring of apps on the App Store as well as moderation of user ratings and reviews and developer responses (as explained at paragraphs 6.7.1 et seq. below). The impact of these systems on the Systemic Risks is detailed in relevant sections of the Report. Furthermore, Apple requires developers whose apps allow UGC to maintain effective content moderation arrangements. While the significance of UGC on the App Store is dramatically lower than as regards some other VLOPs, content moderation is considered in all relevant sections of this risk assessment.

(c) Applicable terms and conditions

5.2.6 Apple maintains comprehensive terms and conditions – applicable to both developers and users – that address key risks facing the App Store, including the Systemic Risks. The terms and conditions provide Apple with a basis for taking prompt action in the event that a developer or user misuses the App Store. Developers and users who object to such action have recourse to various complaints mechanisms.

5.2.7 These terms and conditions, and the ways in which they and their enforcement facilitate Apple’s mitigation of Systemic Risks, are addressed extensively throughout this risk assessment.

(d) Systems for selecting and presenting advertising

5.2.8 Recital 88 provides that “The advertising systems used by [VLOPs...] can also be a catalyser for the systemic risks”.
5.2.9 As detailed in Section 6, the only developer promotion of an app on the App Store appears in Apple Search Ads. These are subject to controls and in any event do not contain any “new” advertising content; this is a system that developers can use to promote apps that have already been approved. As such, Apple does not consider that Apple Search Ads can to any meaningful extent be reasonably or objectively said to be a catalyser for the Systemic Risks.

5.2.10 Apple further notes that Recital 79 to the DSA suggests that the way in which VLOPs “design their services is generally optimized to benefit their often advertising-driven business models and can cause societal concerns.” Although certain VLOPs may design their services in this way, it is certainly not the case for the App Store, where Apple Search Ads only provides developers an opportunity to promote their apps and not to “advertise” additional content. The promoted apps have already been reviewed and approved for the App Store.

(e) Data-related practices of the provider

5.2.11 Apple’s data-related practices are a central differentiator of the App Store, and the whole Apple ecosystem; Apple provides its customers with market-leading standards of protection of privacy, complying in full with applicable data privacy laws.

5.2.12 This risk assessment, including the assessment of the Charter right to the protection of personal data at paragraph 5.7.13 et seq. below, addresses extensively all relevant privacy and data protection considerations, including those that apply at the Apple ecosystem level, and those specific to the App Store.

5.3 Intentional manipulation of the App Store

5.3.1 Furthermore, pursuant to Article 34(2) second paragraph, Apple is required to analyse how the Systemic Risks are influenced by intentional manipulation of the App Store. In this regard, Recital 84 provides that:

“... Providers of very large online platforms ... should, in particular, assess how the design and functioning of their service, as well as the intentional and, oftentimes, coordinated manipulation and use of their services, or the systemic infringement of their terms of service, contribute to such risks. Such risks may arise, for example, through the inauthentic use of the service, such as the creation of fake accounts, the use of bots or deceptive use of a service, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination to the public of information that is illegal content or incompatible with an online platform’s ... terms and conditions and that contributes to disinformation campaigns.”

5.3.2 Malicious actors are constantly seeking to circumvent App Store risk mitigation measures so as to publish or promote apps on the App Store. Where relevant, particularly with respect to “illegal content”, Apple has addressed and factored such intentional manipulation into its risk analysis.

5.4 Regional or linguistic aspects

5.4.1 Pursuant to Article 34(2) third paragraph, Apple is also required to take into account specific regional or linguistic aspects, including any that are specific to a particular Member State, when assessing the Systemic Risks. Recital 84 provides that “Where risks are localised or there are linguistic differences”, VLOPs should account for this in their risk assessments.

5.4.2 Apple does not consider that regional or linguistic aspects have a material impact on the Systemic Risks that might reasonably be argued to stem from the App Store. The App Store is available in 40 languages.
While individual storefronts may address users in or with a connection to particular Member States, and while linguistic and local editorial coverage is provided across those regions and languages, the App Store service and risk mitigation measures are not substantively varied across the EU, other than as may be required by law. Nonetheless, where appropriate in Section 6 below, we refer to regional or linguistic considerations within the EU.

5.5 The Systemic Risks and the App Store

5.5.1 Given the integrated nature of the risk mitigation measures implemented and enhanced by Apple since the launch of the App Store, seeking to identify the systemic risk profile without reference to all mitigation measures inevitably involves some artificiality. Apple recognises that without effective controls any app store, including the App Store, could be open to serious abuse by malicious actors that could engage the Systemic Risks. Since its inception, the guiding principle of the App Store has been to maintain a safe and trusted place for end users to discover and download apps, and extensive controls are in place to ensure that the apps that are offered on the App Store are held to the highest standards for privacy, security, safety and quality. Apple has taken and continues to take steps to keep the App Store a safe place, and to give users control over their preferences, irrespective of any legislative initiatives, such as the DSA.

5.5.2 Apple directly mitigates risks from apps or UGC on the App Store. Developers and consumers are nearest to the source of, and primarily mitigate, risks that arise outside of the App Store. For those risks, developers must engage in risk mitigation measures (such as their own content moderation systems). While Apple’s privacy by design principles mean that Apple cannot carry out an ongoing review of UGC in the app, Apple considers that it is critical for the integrity of its ecosystem to invest in the mitigation of those risks as well, including by making extensive tools available to developers and consumers for those purposes and by requiring developers to maintain certain safeguards in accordance with the DPLA and the Guidelines. Apple also conducts ongoing App Review to help mitigate even those risks which are outside of Apple’s control, as set out further below in Section 6. Such comprehensive controls, which comprise the security architecture of the App Store, are necessary to effectively mitigate risks throughout the lifecycle of an app distributed via the App Store.

5.5.3 However, those risks which do not stem from the design or function of the App Store, or from its use (as opposed to the use of such third-party apps), are extraneous to the App Store. Developers have responsibilities to mitigate risks to users (including those required by Apple under the DPLA), and those which are themselves VLOPs will have their own new risk mitigation measures under the DSA. Risks arising from the design, function or use of their services are not the responsibility of Apple, although they may engage obligations owed to Apple under the DPLA, and are subject to the App Review process.

5.5.4 If Apple identifies through App Review, or is alerted to, content on third-party apps downloaded on a user’s device that engages the Systemic Risks, its practice is to mitigate those risks as efficiently as practicable. Apple typically first brings such matters to the attention of the app developer so that they can take action.
In the event that the developer fails to take appropriate action, Apple can take measures to prevent further distribution via downloads or re-downloads from the App Store, but those actions are in response to risks that stem from use of third-party apps, not use of the App Store, and are therefore independent of any liability under the DSA.

5.5.5 In this Section of this Report, in assessing each of the Systemic Risks specified in Article 34(1), and in considering the probability of such risks arising and the severity of any resulting impacts, Apple has sought to take into account the level of inherent risk, without regard to the extensive App Store risk mitigation measures that address the risk in question, save to the extent that it would be wholly artificial to do so, given that many of the risk mitigation measures are so integral to the way the App Store operates, and so fundamental to its design. Those mitigation measures are addressed in Section 6.

5.6 Article 34(1)(a) – Dissemination of illegal content

5.6.1 “Illegal content” is defined in the DSA as “any information that, in itself or in relation to any activity including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law”. Recital 80 of the DSA provides, as examples of “illegal content”, child sex abuse material or illegal hate speech or other types of misuse of the service for criminal purposes and the conduct of illegal activities. Such dissemination may become a significant systemic risk “where access to illegal content may spread rapidly and widely through accounts with a particularly wide reach or other means of amplification.”34 Apple notes that amplification or proliferation of content (which may contain illegal content) does not form part of the business model of the App Store.

34 Recital 12 further provides that “In order to achieve the objective of ensuring a safe, predictable and trustworthy online environment, for the purpose of this Regulation the concept of ‘illegal content’ should broadly reflect the existing rules in the offline environment. In particular, the concept of ‘illegal content’ should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that the applicable rules render illegal in view of the fact that it relates to illegal activities. Illustrative examples include the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the sale of products or the provision of services in infringement of consumer protection law, the non-authorised use of copyright protected material, the illegal offer of accommodation services or the illegal sale of live animals. In contrast, an eyewitness video of a potential crime should not be considered to constitute illegal content, merely because it depicts an illegal act, where recording or disseminating such a video to the public is not illegal under national or Union law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in compliance with Union law and what the precise nature or subject matter is of the law in question.”
(f) Developer content

5.6.2 As with any online platform, there is a material risk that, absent risk mitigation measures, the App Store could be used to disseminate certain categories of illegal content to users in the EU. This could include, without limitation:

(a) apps designed to disseminate illegal content or facilitate illegal behaviours, such as fraud, including “bait-and-switch” apps;

(b) apps that infringe the intellectual property rights of others;

(c) apps that facilitate activities that are illegal in certain Member States (for example, certain types of real money gambling);

(d) in-app content that is defamatory or intended to offend; or

(e) developer responses to user reviews that are intended to mislead or induce improper behaviours.

5.6.3 However, the App Store developer screening measures, App Review process, content moderation practices and notice and action procedures are designed to and do minimise the potential for dissemination of illegal content or the use of the service for unlawful purposes, and seek to swiftly identify any such content or behaviours at the earliest possible juncture so as to minimise the possibility of their amplification.

5.6.4 Notwithstanding these controls, as noted above, malicious actors are, in practice, constantly trying to evade the App Store’s controls, including through inauthentic use and intentional manipulation of the App Store; in that sense, this Systemic Risk does arise in practice. The App Store 2022 Transparency Report provides some insight into the scale of the threat. In 2022, the App Store rejected 1,679,694 apps / app updates for safety and legal reasons; it removed 186,195 apps for fraud, IP infringement, copycats, and other legal reasons.

5.6.5 As such, absent appropriate controls, the risk of the App Store being used to disseminate illegal content would be high, and, depending on the type of illegal content, the severity of impact of such risk crystallising could range from moderate to extreme (such as in the case of terrorist content or CSAM). However, the App Store maintains risk mitigation measures to address these risks.

(g) User content

5.6.6 The only UGC on the App Store (as opposed to content generated by developers, and UGC within third-party apps) appears in user ratings and reviews of apps available on the App Store.

5.6.7 The risk that App Store-hosted UGC may give rise to the dissemination of illegal content is low to moderate, and most likely to arise through offensive statements, defamation, harassment, and potentially through co-ordinated disinformation or fraudulent campaigns in favour of or against a particular app or developer. However, the limited presence and distribution of UGC makes the App Store a significantly less likely target of such practices, compared with other platforms. Furthermore, Apple moderates all user ratings and reviews and developer responses, and takes action against users and developers who do not comply with applicable ratings and reviews terms and conditions.
5.7 Article 34(1)(b) – Actual or foreseeable negative effects for the exercise of fundamental rights

5.7.1 Article 1 of the DSA provides that its aim is to “contribute to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected.”

5.7.2 Recital 81 of the DSA provides that VLOPs must assess the “impact of the service on the exercise of fundamental rights”. It explains that:

“Such risks may arise, for example, in relation to the design of the algorithmic systems used by the [VLOP...] or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. When assessing risks to the rights of the child, providers of [VLOPs...] should consider for example how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development. Such risks may arise, for example, in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behaviour.”

5.7.3 The App Store is primarily a vehicle for the promotion and fulfilment of fundamental rights, in particular freedom of expression and information, offering developers opportunities to distribute their apps to the users of Apple devices, and those users opportunities to discover and download apps.

5.7.4 Apple notes that human app reviewers on the App Review team are trained to review apps with a view to identifying potential human rights concerns. For example, with a view to safeguarding individuals and users, human reviewers examine each and every app and each and every app update submitted for App Review against the terms of the Guidelines, which clearly prohibit app content that is “offensive, insensitive, upsetting, intended to disgust...”, including “references to commentary about religion, race, sexual orientation or other targeted groups...”.

5.7.5 Apple considers that any Charter Rights risks associated with the design, function or use of the App Store are primarily those set out below.

(h) Rights to human dignity and respect for private and family life, enshrined in Articles 1 and 7 of the Charter

5.7.6 Use of the App Store is capable of engaging (and therefore conceivably capable of giving rise to negative effects on) the right to human dignity in Article 1 and the right to respect for private and family life in Article 7. Given the close relationship between those rights, and the ways in which they may be engaged in connection with the App Store, they are considered together for the purposes of this risk assessment.

5.7.7 Developer use of the App Store may engage these rights where apps are submitted with relevant malign intent, or containing illicit app binary functionality, or lacking the controls required for apps of the relevant kind by the Guidelines (such as, for example, an app encouraging UGC which is not supported by appropriate content moderation measures).
5.7.8 Absent adequate controls, the likelihood of developers seeking to publish apps capable of giving rise to actual or foreseeable negative effects on the rights to human dignity and respect for private and family life would be high, and the severity of such risks could vary from modest to extreme (for example, in the cases of CSAM, so-called “revenge pornography”, “deepfakes”, etc.); indeed, in practice, action does from time to time have to be taken to block or remove apps containing such content. Nonetheless, the App Store maintains risk mitigation measures to address these risks.

(i) Developers’ and users’ rights to the protection of personal data enshrined in Article 8 of the Charter

5.7.9 The right to protection of personal data is closely associated with the right to privacy and the right to human dignity.

5.7.10 When users interact with an app store via their device, the app store provider can collect and process their personal data in a number of different ways. This could include profiling their user behaviour in the application store, including by tracking their browsing and searching activities, and processing their personal data for presenting recommended apps and other content, including advertising material. App store providers could also share this personal data with third parties, including data brokers.

5.7.11 Without appropriate risk mitigation measures on the App Store, there would be a significant risk of negative effects on developers’ and users’ rights to the protection of their personal data.

5.7.12 However, as detailed in Section 3 above (in respect of Apple ecosystem privacy practices) and Section 6 below (in respect of App Store specific privacy practices), the App Store maintains comprehensive policies relating to privacy and data protection, and uses on-device processing to enhance recommendations and mitigate privacy risks.

(j) The rights of developers and users to freedom of expression and freedom of information, including the freedom and pluralism of the media, under Article 11 of the Charter

5.7.13 Developers’ and users’ rights to freedom of expression and information are engaged when they interact with the App Store. Nonetheless, Apple recognises that there is a balance to be struck between freedom of expression and other rights and interests which might be adversely affected by untrammelled exercise of free expression (for example, rights to dignity, privacy, and freedom from discrimination). The Introduction to the Guidelines reflects the App Store’s approach:

“We strongly support all points of view being represented on the App Store, as long as the apps are respectful to users with differing opinions and the quality of the app experience is great. We will reject apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, ‘I’ll know it when I see it’. And we think that you will also know it when you cross it.”

5.7.14 Developers and users are free to use the App Store, save where they do not comply with the law or the Guidelines, which are designed to keep the App Store a safe and trusted place for all.
Each of the rights to freedom of expression and information is susceptible to proportionate limitation, which is the purpose and effect of the Guidelines, and of the risk mitigation measures applicable to the App Store generally. As such, while such risks may conceivably arise in connection with the App Store, the probability of negative effects on these rights arising in practice can only reasonably be seen as remote, and their impact, should they arise, modest. In any event, where developers and users disagree with Apple’s decisions that could engage freedom of expression and information, there are complaints processes available to address such concerns.

5.7.15 Recital 81 of the DSA refers to freedom of expression or information being threatened by misuse, including the submission of “abusive notices or other methods for silencing speech or hampering competition”. In the context of the App Store, this risk can arise in the context of abusive challenges to published apps or improper or bad faith ratings and reviews about an app submitted by competitors. This risk may arise from developer or end user use of the App Store. App Store controls are designed to and do protect against these risks.

5.7.16 The App Store is a vehicle for media pluralism across the EU, counting among its developers a very wide range of media voices. The App Store is not a news service or news aggregator. Media apps are available on the App Store unless illegal or otherwise in breach of the Guidelines. While the risk of repressive governments seeking to abuse powers to require takedown of apps or content cannot be discounted, in practice, the prevalence of such behaviour within the EU is low (albeit non-negligible), and would be subject to legal challenge with strong prospects of success under domestic rights norms in the Member States, informed by the European Convention on Human Rights.

5.7.17 Apple notes that in its November 2022 Discussion Document on Media plurality and online news,35 Ofcom, the UK’s Office of Communications, makes no mention of the App Store, which tends to corroborate the view that any risks of negative impacts on media pluralism stemming from the design, function or use of the App Store are low. Apple has also not identified any commentary from the European Parliament or European Commission that refers to the App Store giving rise to a systemic risk to media plurality in the EU.

35 https://www.ofcom.org.uk/__data/assets/pdf_file/0030/247548/discussion-media-plurality.pdf

(k) The right to non-discrimination under Article 21 of the Charter

5.7.18 Article 21 of the Charter provides that discrimination based on any ground such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation shall be prohibited. Article 21(2) of the Charter also imposes a prohibition on discrimination based on nationality.

5.7.19 In principle, an app store could discriminate against users or developers on prohibited grounds when granting them access to the service, when reviewing whether apps will be published on the service, or when determining which apps will be made available to them.

5.7.20 Apple does not discriminate against developers or users, including when conducting developer screening, App Review, or responding to notices and actions (including from law enforcement).
5.7.21 As regards app recommendations and Apple Search Ads, if a user has personalisation turned on, age, gender and location are used to present personalised content, but such conduct does not amount to discrimination (and in any event ad personalisation can be switched off).

5.7.22 As regards developer use, although discriminatory content is clearly prohibited under the Guidelines, there is a risk that users could be exposed to such content in the App Store if it were not identified during the App Review process. However, app reviewers are trained to identify such content, and the notice and action and complaints mechanisms provide means to raise relevant concerns regarding apps that are already published on the App Store.

(l) The freedom to conduct a business under Article 16 of the Charter (to the extent that a developer’s apps must follow the rules of the App Store)

5.7.23 For the purposes of this risk assessment, Apple has considered whether a developer’s right to freedom to conduct a business could conceivably be negatively affected if the developer were prevented without justification from distributing apps on the App Store, or were terminated or had its apps taken down without justification.

5.7.24 While this right could conceivably be engaged in such circumstances, a number of factors indicate that the probability of negative impacts on the enjoyment of this right arising in practice is low, and the severity of impacts modest:

(a) First, the right under Article 16 does not imply a right to enter into contractual relationships with any given counterparty;

(b) Second, any engagement of this right through developer termination or restrictions on apps would be substantially mitigated by the existence of numerous other platforms and other media on which apps may be published and distributed;

(c) Third, under the Charter, this freedom is susceptible to proportionate limitation, which is the purpose of the App Store risk mitigation measures generally; and

(d) Fourth, developers have at their disposal numerous options for contesting unfavourable decisions relating to the publication of apps on the App Store, including an internal appeals process, mediation vehicles (such as through the mechanism afforded under the Platform-to-Business Regulation36) and the courts.

5.7.25 As such, while a developer’s business may be affected by a decision on Apple’s part, it does not follow that the developer’s right under Article 16 is engaged by such a decision; and even were it accepted that the right could be engaged, any concerns arising under this Article can only reasonably be seen as highly remote, and the impact of such concerns, very modest.

(m) The rights of the child enshrined in Article 24 of the Charter

5.7.26 Article 24 of the Charter provides that “Children shall have the right to such protection and care as is necessary for their well-being. They may express their views freely.”

5.7.27 Apple notes the reference in Recital 71 of the DSA to the new European strategy for a better internet for kids (BIK+). The three pillars of BIK+ are (1) safe digital experiences to protect children from harmful and illegal content, conduct, contact and risks...
and to improve their well-being online; (2) digital empowerment so all children, including those in situations of vulnerability, acquire the necessary skills and competences to make sound choices and express themselves in the online environment safely and responsibly; and (3) active participation, respecting children by giving them a say in the digital environment.

5.7.28 The App Store is not a service that is directed at or predominantly used by minors. However, Apple recognises that minors access apps available on the App Store and maintains controls to ensure that they are protected. The introductory section to the Guidelines reminds developers: “We have lots of kids downloading lots of apps. Parental controls work great to protect kids, but you have to do your part too. So know that we’re keeping an eye out for the kids.” Apple notes that a multitude of apps available on the App Store allow parents and guardians to enable their children to learn and acquire new skills to enhance their digital empowerment.

5.7.29 An app store not protected by appropriate risk mitigation measures could give rise to, or be used in a manner giving rise to, risks under this provision. In practice, Apple does enforce the Guidelines to restrict apps or app content which may be harmful to children, and, as detailed in Section 6, maintains a number of controls to protect children. Moreover, as detailed in Section 3, Apple provides parents and guardians with a suite of controls to give them greater choice and oversight of the manner in which their children engage with apps on the App Store.

36 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services.

5.7.30 Were such a Systemic Risk to crystallise, the potential impacts could, again, be severe. Nonetheless, Apple considers that its relevant risk mitigation measures are reasonably and proportionately designed to address this risk. As such, the risk in this area arising from the design or functionality of the App Store must fairly and reasonably be considered to be low.

(n) High level of consumer protection, enshrined in Article 38 of the Charter

5.7.31 Article 38 of the Charter provides that “Union policies shall ensure a high level of consumer protection.” As acknowledged in Article 1 of the DSA, Article 38 reflects a requirement on the EU, and not a right having horizontal effect, capable of enforcement as between private persons. Nonetheless, Apple interprets the obligation in Article 34(1)(b) as including a requirement to assess whether the design, functionality or use of the App Store gives rise to any actual or foreseeable negative effects on the provision of a high level of consumer protection to end users of the App Store in the EU. It does not interpret the reference to Article 38 of the Charter in Article 34(1)(b) of the DSA to imply a requirement to assess the App Store’s compliance with the consumer protection acquis of the EU generally, nor with the consumer protection laws of each Member State.

5.7.32 The risk mitigation measures detailed in Section 6 are all designed to ensure that consumers (and indeed developers) are protected when they engage with the App Store.

5.7.33 Absent appropriate controls, the risks of negative effects on consumer protection (across a broad range of potential negative outcomes) would be high, as would be the potential severity of impacts.
5.7.34 Notwithstanding the above, protection of consumers is a foundational principle of the App Store, and the combined effect of the App Store’s various risk mitigation measures is to provide end users with a market-leading level of consumer protection.

5.8 Article 34(1)(c) – Actual or foreseeable negative effects on civic discourse, electoral processes and public security

5.8.1 The App Store is a vehicle for the promotion of both private and public civic discourse. Government agencies, non-profits and citizens use the App Store to disseminate apps that contain information and allow them to communicate on matters relating to electoral processes, information relevant to civic discourse and public security.

5.8.2 The purpose and scope of the Systemic Risk referred to in Article 34(1)(c) is not further explained in the recitals to the DSA, although Recital 79 contends that VLOPs can be used in “the shaping of public opinion and discourse”.

(a) Electoral processes

5.8.3 While online platforms can be used to disseminate false information that threatens meaningful debate and electoral processes, and which facilitates the spread of communications antithetical to public security, the likelihood of the App Store being used for such purposes is very substantially lower than for online platforms focussing primarily on UGC. In the App Store, other than apps submitted by malign actors for malign purposes, this would only seem likely to arise in the context of targeted disinformation in ratings and reviews of apps related to civic discourse, electoral processes or public security.

5.8.4 While the potential impacts of such risks crystallising would range from modest to potentially severe, Apple considers its risk mitigation measures to be reasonably and proportionately designed to address this category of Systemic Risk to the extent that it arises from use of the App Store, and to be effective in doing so in practice (as to which, see Section 7 of this Report).

(b) Civic discourse and public security (including disinformation)

5.8.5 As regards potential negative effects on civic discourse and public security, the App Store does not give rise to such risk to an extent remotely comparable with those online platforms whose business models are driven by the widespread dissemination and rapid amplification of content, including UGC or news. The App Store’s developer and app approval processes (and its ongoing review of live apps) include controls designed to identify apps intended to have an adverse impact on civic discourse, for example those apps designed to disseminate unlawful extremist content or disinformation.

5.8.6 Recital 84 of the DSA provides that “When assessing the systemic risks identified in this Regulation, [VLOPs] should also focus on the information which is not illegal, but contributes to the systemic risks identified in this Regulation. [VLOPs] should therefore pay particular attention on how their services are used to disseminate or amplify misleading or deceptive content, including disinformation”.

5.8.7 In practice, Apple does enforce the Guidelines from time to time on grounds capable of having a limiting effect on civic discourse, such as taking action in circumstances where apps include offensive content, or harmful concepts which capitalise or seek to profit on recent or current events, such as violent conflicts, terrorist attacks, or epidemics.
5.8.8 Apple recognises that certain messaging or social media apps that are available on the App Store (and other app stores) have been found to be used to communicate during protests and in times of civil unrest, and that such communications could be seen to adversely impact public security. To the extent that the use of these apps gives rise to public security concerns, such use does not stem from the design, function or use of the App Store. To the extent that users are using an app to disseminate illegal content or incite illegal behaviour, primary responsibility for that content or conduct lies with the user in question, albeit that the developer may have responsibility for the design, function or use of that app.

5.8.9 The risk that user ratings or reviews of apps hosted on the App Store may negatively affect civic discourse, electoral processes, or public security is low, albeit theoretically possible through co-ordinated disinformation campaigns relating to matters such as public health or security, or to influence elections, or through manipulative behaviour to influence ratings of apps relevant to these matters through the use of bots. Apple considers the stringent controls that apply to App Store-hosted UGC proportionate to the risk posed by such content.

5.9 Article 34(1)(d) – Actual or foreseeable negative effects on gender-based violence, the protection of public health and minors and serious negative consequences to a person’s physical and mental well-being

5.9.1 Recital 83 provides that risks to the protection of public health, minors and serious negative consequences to a person’s physical and mental well-being, or on gender-based violence may “stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioural addictions of recipients of the service”.

(a) Gender-based violence

5.9.2 The risk of actual and foreseeable negative effects stemming from the design, function or use of the App Store relating to gender-based violence may arise from dissemination of problematic apps or problematic UGC.

5.9.3 The risk of the App Store being used to disseminate apps having a potential adverse effect on gender-based violence is similar to the risks described in relation to illegal content under Article 34(1)(a) above.

5.9.4 Similarly, the App Store’s controls that protect against illegal and harmful content extend to any app designed to be used in such a way as to have an actual or foreseeable negative effect on gender-based violence.

5.9.5 The probability of exposure to this category of Systemic Risk is similar to that described above for illegal content generally, and were such risks to crystallise, the potential impacts could, again, be severe. Again, however, the App Store maintains risk mitigation measures to address these risks.

5.9.6 In light of the UGC content moderation controls, the risk of user ratings or reviews of apps hosted on the App Store producing negative effects on gender-based violence is low.

(b) Protection of minors

5.9.7 Apple has set out at paragraph 5.7.26 et seq.
its assessment of the Systemic Risk regarding the rights of the child under Article 24 of the Charter.

(c) Protection of public health, serious negative consequences to a person’s physical and mental well-being

5.9.8 Risks to public and individual health do not arise from the use of the App Store in a manner or to an extent remotely comparable with those online platforms whose design, function and / or use involve the widespread dissemination and rapid amplification of UGC. The App Store’s developer and app approval processes (and its ongoing review of live apps) include controls designed to identify apps intended to have an adverse impact on public health.

5.9.9 Engagement with the App Store does not give rise to addiction issues that have the potential to cause serious negative consequences to a person’s physical and mental well-being. To the extent that such risks arise outside of the App Store after users download apps, Apple’s Screen Time functionality, referred to in Section 3, can be used by adults and minors (and their parents) to track and control the time they are spending on particular apps.

5.9.10 The probability of exposure to this category of Systemic Risk arising from developer use of the App Store should fairly and reasonably be considered to be no more than modest, although, to the extent that such risks were to crystallise, their impact could be significant. Again, Apple considers its risk mitigation measures to be proportionate and effective in this regard.

5.9.11 In light of the UGC content moderation controls, the risk that user ratings or reviews of apps hosted on the App Store may produce negative effects on public health, or physical and mental well-being, is low.

5.10 Consumer use of apps downloaded from the App Store

5.10.1 This section addresses Apple’s approach to Systemic Risks which may arise from consumer use of an app that has been downloaded from the App Store.

5.10.2 As described above, the Systemic Risks may stem from third-party UGC within an app. Those risks do not stem from the design, function or use of the App Store; they stem from the consumer’s use of the app. Nor are these risks susceptible to direct control by Apple or by the risk mitigations in place in respect of the App Store; primary responsibility for mitigation of risks arising in connection with such content rests with the developer of the app. If Apple is alerted to UGC engaging Systemic Risks on third-party apps, its practice is to bring such matters to the attention of the app developer so that action can be taken; in the event that appropriate action is not taken, Apple has a range of measures it can take to enforce its requirements on developers, but those actions are outside the scope of the DSA, as they stem from the use of third-party apps, not use of the App Store.

5.10.3 Nonetheless, as described below, there are controls in place in respect of the App Store to enable action to be taken to address inappropriate or unlawful UGC in apps published on the App Store.

(a) Mitigation of risks stemming from UGC within a developer’s app

5.10.4 Many online platforms that offer an app on the App Store in the EU are or will be themselves subject to the DSA, in some cases as VLOPs. Those platforms have primary responsibility for the services they offer and any content, including UGC, hosted on their platform.
While those platforms are required by Apple to have in place content moderation systems in order to be approved for publication on the App Store, responsibility for moderating UGC on those apps falls to the app developer of the platform, not to the App Store. Nevertheless, apps can be and are removed from the App Store if Apple determines that an app does not comply with Section 1.2 of the Guidelines (User-Generated Content).

5.10.5 While concerns around UGC engaging Systemic Risks on such platforms may be brought to the attention of the various teams responsible for the operation of the App Store, Apple is neither required under the DSA, nor in a position, to monitor such UGC, and in practice its primary recourse is to bring such UGC to the attention of the developer of the app on which the offending UGC is hosted (see paragraph 5.10.2 above). In the event that the developer fails to act on such a report, Apple may remove the app and / or terminate the developer in accordance with the App Store terms and conditions.

(b) Obligations for developers for apps already published on the App Store

5.10.6 In order for a developer to submit its app to the App Store for distribution, it must comply with the Guidelines. Controls are in place which relate specifically to in-app functionality and content (including specific references to user-generated content), and content moderation. Under the terms of Apple’s contractual framework, with which all developers must comply, it is made clear that developers are responsible for complying not only with the Guidelines, but also with all applicable laws.

5.10.7 For example:

(a) The ‘Before You Submit’ section of the Guidelines makes clear that “[i]f your app no longer functions as intended or you’re no longer actively supporting it, it will be removed from the App Store”. Further, it is clear from Guideline 2.3.1 (Metadata) that developers should not “include any hidden, dormant, or undocumented features in your app; your app’s functionality should be clear to end users and App Review.” This is echoed by Section 3.3.3 of the DPLA, which provides that “an Application may not provide, unlock or enable additional features or functionality through distribution mechanisms other than the App Store, Custom App Distribution or TestFlight”.

(b) As regards app content, the Guidelines stipulate that “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy” (Guideline 1.1 (Objectionable Content)). Examples given in the Guidelines of objectionable content include defamatory, discriminatory, or mean-spirited content; realistic portrayals of violence including people or animals being killed, maimed, tortured, or abused; overtly sexual or pornographic material; and harmful concepts which capitalise or seek to profit on recent or current events.

(c) Developers must also take steps to enable moderation of an app’s user-generated content. Guideline 1.2 (User-Generated Content) provides that apps which include user-generated content must include tools to prevent abuse, including a method for filtering objectionable material from being posted to the app, a mechanism to report offensive content and timely responses to concerns, the ability to block abusive users from the service, and published contact information so users can easily reach the relevant developer. A sketch of these required capabilities follows this list.
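Purely for illustration, the capabilities required by Guideline 1.2 can be expressed as an interface that a UGC app must satisfy. The protocol and method names in the Swift sketch below are assumptions; Guideline 1.2 itself is expressed in prose, not code.

```swift
import Foundation

// A minimal sketch of the capabilities Guideline 1.2 requires of apps with
// user-generated content; the protocol and method names are assumptions.
protocol UGCModerationRequirements {
    /// A method for filtering objectionable material from being posted.
    func isAcceptable(_ post: String) -> Bool

    /// A mechanism to report offensive content, with timely responses.
    func report(contentID: String, reason: String)

    /// The ability to block abusive users from the service.
    func block(userID: String)

    /// Published contact information so users can reach the developer.
    var supportContact: String { get }
}
```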
5.11 Risks if Apple's key mitigation measures do not fully address the Systemic Risks

5.11.1 For the reasons set out above, although Apple considers that each of the Systemic Risks described above could stem from the design, function or use of the App Store, they are all risks that Apple recognises and mitigates. And, as described in Section 6 below, Apple's risk mitigation measures are continually adapted and improved to build on learning and address ever-evolving risks.

5.11.2 As with any controls framework, there is always a measure of risk arising from the fact that the existing risk mitigation measures in place cannot be expected to have a 100% success rate in mitigating the Systemic Risks which may stem from the App Store, particularly as the nature of threats evolves. These may include, for example:

(a) the risk that the App Store's terms and conditions do not fully address the Systemic Risks or afford Apple a basis for enforcing them in order to mitigate Systemic Risks;

(b) the risk that the developer onboarding process fails to identify a developer whose intent is to publish apps which may give rise to Systemic Risks;

(c) the risk that the automated review systems of the App Review process fail to detect illicit app binary functionality;

(d) the risk that the App Store human app review does not identify apps that do not comply with the terms and conditions, in particular the Guidelines;

(e) the risk that recommender and algorithmic systems deployed in connection with the App Store recommend and display apps that are illegal in specific regions or have adverse impacts in respect of the other Systemic Risks;

(f) the risk that the App Store systems that moderate UGC on the App Store (that is to say, app ratings and reviews) do not detect and remove UGC engaging the Systemic Risks; and

(g) the risk that the App Store notice and action systems do not adequately provide a means for Apple employees, or developers, users or third parties to raise alerts regarding apps or UGC engaging Systemic Risks.

5.11.3 These matters are among those considered in addressing the reasonableness, proportionality and effectiveness of Apple's App Store risk mitigation measures in Section 7.

SECTION 6: MITIGATION OF POTENTIAL SYSTEMIC RISKS ARISING FROM THE DESIGN, FUNCTIONING OR USE OF THE APP STORE

6.1 Section overview

6.1.1 Section 3 of this Risk Assessment provides details regarding relevant Apple ecosystem functions, policies and protections. These are not repeated here.

6.1.2 Section 4 of this Risk Assessment provides an overview of the way users can discover and download apps from the App Store, as well as a high-level description of key controls that apply before an app is published. Section 5 of this Risk Assessment then identifies the way in which Systemic Risks might potentially crystallise in the App Store.

6.1.3 This Section provides more detail on key control functions and the risk mitigation measures that form part of the design or functioning of the App Store and that operate to keep the App Store a safe and trusted place for all users. Apple considers that the risk mitigation measures detailed in this Section and elsewhere in the report constitute risk mitigation measures relevant to its obligation under Article 35 of the DSA to put in place reasonable, proportionate and effective risk mitigation measures.
6.1.4 The Section is structured as follows:

(a) App Store Policies, Terms and Conditions that mitigate systemic risks;
(b) Developer Due Diligence Measures;
(c) App Review;
(d) App Store and Privacy;
(e) Recommender Systems Risk Mitigation Measures;
(f) App Store User-Generated Content Measures;
(g) App Store External Notice and Action Measures; and
(h) New DSA Compliance function.

6.2 App Store Policies, Terms and Conditions that mitigate systemic risks

6.2.1 Pursuant to Article 34(2) of the DSA, Apple is required to assess how certain listed factors influence the Systemic Risks. These factors include "the applicable terms and conditions and their enforcement". An overview of App Store terms and conditions and their enforcement is detailed below.

(a) App Store consumer terms and conditions

6.2.2 Before an end user can use the App Store, they must agree to the Apple Media Services Terms and Conditions (the "AMS Terms"), 37 which govern the use by end users of the App Store service.

37 https://www.apple.com/legal/internet-services/itunes/ie/terms.html

6.2.3 Use of the App Store requires the creation of an Apple ID. Anyone 13 years of age or over, or the equivalent minimum age in their country or territory of residence, can create an Apple ID. 38 Apple IDs for individuals under this age can be created by parents or guardians using Family Sharing. 39 Apple recommends that parents or legal guardians creating an account for a minor should review the AMS Terms with the minor to ensure they understand them.

6.2.4 The AMS Terms contain Submission Guidelines that apply to user ratings and reviews on the App Store. The Submission Guidelines prohibit various forms of misuse, including using the App Store to:

(a) post any materials that (i) users do not have permission, right or licence to use, or (ii) infringe on the rights of any third party;
(b) post objectionable, offensive, unlawful, deceptive, inaccurate or harmful content;
(c) post personal, private or confidential information belonging to others;
(d) request personal information from a minor;
(e) post, modify or remove a rating or review in exchange for any kind of compensation or incentive;
(f) post a dishonest, abusive, harmful, misleading, or bad-faith rating or review, or a rating or review that is irrelevant to the content being reviewed; or
(g) plan or engage in any illegal, fraudulent, or manipulative activity.

6.2.5 In addition, the AMS Terms detail various prohibitions, including: manipulating play counts, downloads, ratings or reviews via any means, including the use of bots, scripts, or automated processes, or providing or accepting any kind of compensation or incentive. Users who breach these requirements can be removed from the App Store.

6.2.6 The AMS Terms also explain that users can report use of the App Store that does not comply with the Submission Guidelines via the "Report a Concern" function.
6.2.7 The AMS Terms also set out the requirements for "Family Sharing" accounts. The "family organizer" must be 18 (or an equivalent age of majority in their country or territory of residence), and the parent or legal guardian of any users under age 13. The AMS Terms also explain how purchase sharing works, and the ways in which eligible content is shared among members of a family, including the "Ask to Buy" feature.

38 When a user creates an Apple ID they are asked for their date of birth. If a user is below the relevant age, then a parent must create the Apple ID. As part of the process of creating an Apple ID for a child, parents will be asked to provide information required to create an account, which may include: the child's full name, date of birth, a password and a phone number. Where a parent is creating an account for a child under the age of 13, Apple may require that the parent confirm a payment method Apple already maintains for the parent. Beyond that, and in keeping with Apple's approach to privacy, the principles of the UK Information Commissioner's Office Children's Code, and Article 28(3) of the DSA, Apple collects as little information about children as possible. To that end, Apple does not request proof of age and does not analyse biometrics or use other technologies to assess age.
39 See paragraph 3.4.2 et seq.

6.2.8 The AMS Terms also make clear that the developer of any third-party app is solely responsible for its content, subject to local law.

6.2.9 The AMS Terms explain to users the factors that determine how results are presented when they use the App Store search function, including metadata provided by the app developer, user engagement with apps and the App Store, and an app's popularity.

6.2.10 Finally, the AMS Terms also explain to users how they can contact Apple if they believe that content featured on the App Store infringes their copyright, with a separate link and notice associated with third-party apps. 40 They also explain the steps Apple can take against a user who is found to have repeatedly infringed the copyrights of others. The AMS Terms refer to redress options available to users who have been notified that their reviews have been removed from the App Store.

(b) App Store developer terms and conditions

(i) Apple Developer Agreement

6.2.11 To get access to certain resources for learning how to develop apps, developers must execute the ADA. 41 The ADA contains the terms and conditions for registering with Apple to become an Apple Developer and governs the use of the Apple Developer website, beta software and events, and may include the opportunity to attend certain Apple-provided technical talks and other events, including online or electronic broadcasts of such events. It also addresses export controls, including prohibitions against contracting with sanctioned individuals and entities.

6.2.12 If a developer breaches the terms of the ADA, Apple can at its discretion terminate or suspend the developer.

(ii) Apple Developer Program License Agreement

6.2.13 To enrol in the Apple Developer Program (a necessary step for developers wishing to publish apps on the App Store), developers must also execute the DPLA, enrolling as an individual or an organisation (e.g., a company, non-profit or government organisation). 42 An individual or an authorised employee of an organisation must use an Apple device (while being logged into iCloud and using their Apple ID with two-factor authentication turned on) to log into the Apple Developer website or app, where they review and accept the DPLA.
If they are logged into the Apple Developer app, they must also verify their identity using a government-issued photo ID. Organisations must provide information about the organisation (for example, entity type; legal entity name; D-U-N-S Number; headquarters address and phone number; website; and signing authority confirmation).

40 https://www.apple.com/legal/internet-services/itunes/appstorenotices/#?lang=en
41 https://developer.apple.com/support/downloads/terms/apple-developer-agreement/Apple-Developer-Agreement-20230605-English.pdf
42 https://developer.apple.com/programs/apple-developer-program-license-agreement/

6.2.14 The DPLA grants a limited licence to developers to use certain Apple software and services for app development; apps may be distributed through the App Store, or through other distribution channels (for example, Custom App Distribution to organisational customers, ad hoc testing on registered devices, or TestFlight for beta testing). Below is a summary of some relevant provisions:

6.2.15 Section 3.2 provides that developers will not use the Apple software or services, including the App Store, to:

(a) engage in unlawful or illegal activity, nor to develop products which would commit or facilitate the commission of a crime, or other tortious, unlawful or illegal acts;
(b) threaten, incite or promote violence, terrorism or other serious harm;
(c) create or distribute any content or activity that promotes child sexual exploitation or abuse;
(d) violate, misappropriate or infringe proprietary or legal rights;
(e) violate the security, integrity or availability of any user, network, computer or communications system; or
(f) engage, or encourage others to engage, in any unlawful, unfair, misleading, fraudulent, improper or dishonest acts or business practices (for example, engaging in bait-and-switch pricing, consumer misrepresentation, deceptive business practices, or unfair competition against other developers).

6.2.16 Section 3.3 provides that developers must, in the app description on the App Store, provide clear and complete information to users regarding their collection, use and disclosure of user or device data. They are also required to take appropriate steps to protect such data from unauthorised use, disclosure or access by third parties. In addition, developers must maintain a privacy policy, which details their collection, use, disclosure, sharing, retention, and deletion of user or device data, and which must be published on their website with a link in the App Store.

6.2.17 Section 11.2 explains that Apple can terminate a DPLA with a given developer if the developer:

(a) violates the DPLA, including the terms listed above in Section 3.2;
(b) becomes subject to sanctions or other restrictions in relevant regions; or
(c) engages, or encourages others to engage, in any misleading, fraudulent, improper, unlawful or dishonest act, including misrepresenting the nature of an app (for example, hiding or trying to hide functionality from Apple's review, falsifying consumer reviews, or engaging in payment fraud).

(iii) Schedules 1 and 2 to the DPLA

6.2.18 To distribute apps through the App Store, Apple Developers must accept the terms of Schedule 1 (for free apps) or Schedule 2 (for paid apps or apps using Apple's In-App Purchase API) to the DPLA. 43 These Schedules appoint ADI as the commissionaire for the marketing and end-user download of apps distributed in the EU.
The Schedules also contain requirements for the delivery of apps to Apple and end users; ownership of apps and app information; end user licensing; content restrictions; and age ratings. In addition, Schedule 2 addresses commerce and tax issues. Below is a summary of some relevant provisions:

6.2.19 Section 2.4 provides that the developer is responsible for:

(a) determining and implementing any age ratings or parental advisory warnings required by the applicable government regulations, ratings board(s), service(s), or other organisations for any content offered in their app; and
(b) providing any content restriction tools or age verification functionality before enabling end users to access mature or otherwise regulated content within their app.

6.2.20 Section 5 requires developers to warrant and represent that:

(a) their app does not violate, or permit users to violate, intellectual property or contractual rights;
(b) their app is authorised for distribution, sale and use in, export to, and import into each of the regions designated;
(c) their app does not contain any obscene, offensive or other materials prohibited or restricted under the laws or regulations of any of the regions they designate for distribution;
(d) their app information is accurate;
(e) they will provide correct and complete information about the content of their app in assigning an app rating;
(f) their app shall not target children in any region where doing so is illegal; and
(g) their app complies with all applicable laws where distributed, including consumer protection, marketing, and gaming laws.

6.2.21 Section 7.3 explains that Apple may cease the marketing and allowing download of an app (for example, remove an app or terminate a developer) if the developer or app:

(a) is not authorised for export;
(b) infringes intellectual property rights;
(c) violates any applicable law;
(d) violates the terms of the DPLA, Schedules to the DPLA, or the Guidelines; or
(e) is subject to sanctions of any region in which Apple operates.

43 https://developer.apple.com/support/downloads/terms/schedules/Schedule-2-and-3-20220225-English.pdf

6.2.22 Revisions to the DPLA Schedules make reference to redress options available to developers who have been notified that their app has been removed from the App Store or that their developer account has been terminated.

6.2.23 As reported in Apple's 2022 Transparency Report, Apple terminated 428,487 developer accounts, the vast majority of which were due to non-compliance with Section 3.2(f) of the DPLA (which prohibits developers using Apple's services to engage, or encourage others to engage, in any unlawful, unfair, misleading, fraudulent, improper, or dishonest acts or business practices, including bait-and-switch pricing, consumer misrepresentation, deceptive business practices, or unfair competition against other developers). Only 3,338 of those terminations were appealed, and of those only 159 resulted in account restorations. 44

(iv) App Store Review Guidelines

6.2.24 All Apple Developers who want to distribute apps in the App Store must comply with the Guidelines, which provide requirements for apps to be approved and remain available on the App Store. 45 The five pillars of the Guidelines are Safety (Section 1), Performance (Section 2), Business (Section 3), Design (Section 4), and Legal (Section 5).
Overall, the Guidelines require that apps offered on the App Store are safe, provide a good user experience, adhere to Apple's rules on user privacy, secure devices from malware and threats, and use approved business models.

6.2.25 All new apps and updates to existing apps are reviewed for compliance with the Guidelines. Specific provisions of the Guidelines are discussed in more detail below in the section addressing App Review risk mitigation measures.

6.2.26 The Guidelines are subject to periodic review, updates, and additions, to account for the needs of customers, developer innovation, changes in technology and law, ongoing App Review learnings, and developments in the App Store risk landscape. This offers opportunities to enhance the Guidelines and address risk generally, including the Systemic Risks. For example, Guideline 1.1.7, which prohibits harmful concepts which seek to profit from current events, including violent conflict, terrorist attacks and epidemics, was put in place [CONFIDENTIAL]. 46 While Apple strives for continuity in the Guidelines, changes in developer practices, technology and risk, as well as the desire to provide transparency to developers, require periodic updates of the Guidelines to be made.

44 https://www.apple.com/legal/more-resources/docs/2022-App-Store-Transparency-Report.pdf
45 https://developer.apple.com/app-store/review/guidelines/
46 https://developer.apple.com/news/?id=xk8d7p8c

6.2.27 The App Store provides mechanisms for developers to submit feedback on the Guidelines via the "suggest a guideline" form, and such feedback is factored in when the Guidelines are under review. 47 The App Review team compiles requests and suggestions on modifications to the Guidelines, [CONFIDENTIAL]. Changes to the Guidelines have occurred annually, and sometimes multiple times a year. Once an update is made, the revised Guidelines are published online and developers are notified of the update, both via email and a dedicated update published on Apple's "News and Updates" area of the Apple Developer website.

6.3 Developer Due Diligence Measures

(a) Sanctions screening

6.3.1 Apple conducts sanctions screening for all developers who wish to join the Apple Developer Program. Developer names and contact details are run against government consolidated sanctions lists. Two types of sanctions screening are conducted: one for individuals, based on information submitted in the Developer Information page, and one for organisations, based on information submitted in the Enrolment Information page of the enrolment.

6.3.2 Where a sanctions report contains a positive hit and the developer challenges a positive sanctions determination, the Global Export Sanctions Compliance team will seek more information from the developer. They then factor that additional information into any final determination.

6.3.3 Apple also conducts ongoing sanctions monitoring to ensure that developers who are already admitted to the Apple Developer Program have not been added to a sanctions list.

(b) Identity verification and screening

6.3.4 As explained above, individuals and organisations must sign in with an Apple ID with two-factor authentication, review and accept the latest terms of the Apple Developer Agreement, 48 and enter identity information. If the developer is enrolling via the Apple Developer app, they are asked to verify their identity with a driver's licence or government-issued photo ID.
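The Report does not describe the mechanics of the screening in paragraphs 6.3.1 to 6.3.3. Purely as an illustration of the general technique, the sketch below checks a submitted enrolment name against a consolidated sanctions list using normalised exact matching; all type and function names (SanctionsEntry, screen, and so on) are hypothetical, and a real screening system would use richer fuzzy and alias matching.

```swift
import Foundation

// Hypothetical entry from a government consolidated sanctions list.
struct SanctionsEntry {
    let listedName: String
    let programme: String  // e.g. the sanctions regime under which the party is listed
}

// Normalise a name so that case, diacritics and extra whitespace
// do not cause listed parties to be missed.
func normalised(_ name: String) -> String {
    let folded = name.folding(options: [.diacriticInsensitive, .caseInsensitive],
                              locale: .current)
    return folded.split(separator: " ").joined(separator: " ")
}

// Return any list entries matching the submitted enrolment name.
// Exact matching on normalised names is shown for clarity only.
func screen(enrolmentName: String,
            against list: [SanctionsEntry]) -> [SanctionsEntry] {
    let candidate = normalised(enrolmentName)
    return list.filter { normalised($0.listedName) == candidate }
}

let hits = screen(enrolmentName: "Example Developer Ltd",
                  against: [SanctionsEntry(listedName: "EXAMPLE DEVELOPER LTD",
                                           programme: "EU consolidated list")])
// A non-empty result would be routed for manual review, consistent with
// the challenge process described in paragraph 6.3.2.
print(hits.isEmpty ? "No hit" : "Positive hit – escalate for review")
```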
6.3.5 Trust & Safety Developer Fraud conducts identity verification and other risk-based checking, in order to identify developers which it considers may be unlikely to comply with the ADA and DPLA. Apple uses submitted developer data, as a secure hash, to scan for and block developers attempting to register multiple accounts.

47 https://developer.apple.com/app-store/review – see "Suggestions".
48 https://developer.apple.com/support/downloads/terms/apple-developer-agreement/Apple-Developer-Agreement-20230605-English.pdf

6.3.6 The enrolment screening process helps Apple identify – and therefore stop from gaining access to the App Store – fraudulent or sanctioned actors whom Apple determines to be likely to develop and distribute apps that may contain illegal or harmful content.

6.4 App Review

6.4.1 Apps and app updates submitted to the App Store are uploaded through App Store Connect, where developers create an app record and provide app metadata (including the app binary), along with the app name and description and other relevant information. A complete set of metadata must be provided (i.e. if a submission includes "placeholder" text, it will be rejected). All such data relating to apps and app updates are then reviewed by both automated tools and human app review specialists, both of which are a critical component of App Review.

6.4.2 There are more than 100,000 app submissions in an average week. In 2022, App Review reviewed 6,101,913 submissions (including app updates). The App Review team rejected over 25% of those submissions for various compliance issues, thereby serving an important function in mitigating risks, including potential Systemic Risks, in the App Store. 49

(a) Automated review

6.4.3 Upon receipt of an app or app update, the App Review automated review process conducts a static binary analysis, asset analysis, and runtime analysis [CONFIDENTIAL] and analyses threats and signals (for example, the presence of malicious URLs or executable code, which for example could introduce or change features or functionality of the app). The automated review process also conducts checks [CONFIDENTIAL], and cross-references apps and developers against previously identified threats in the App Store ecosystem to better detect malicious actors, fraud, and other abuses.

6.4.4 For over a decade, using proprietary machine learning tools and technologies, the App Store has developed an internal corpus of information used to mitigate risks, such as previously identified threats, identified malicious apps and developers, suspicious keywords, and malicious IP addresses and URLs. For example, malicious URL detection involves analysing URLs that have been previously flagged for illegal or harmful content or characteristics. By analysing information in new app submissions for similarities with previously identified information, the automated review component of the App Review process helps keep bad apps and actors from entering or re-entering the App Store.

49 https://www.apple.com/legal/more-resources/docs/2022-App-Store-Transparency-Report.pdf. See also the supplemental data file at https://www.apple.com/legal/zip/2022-Supplemental-Data-File.zip.
50 [CONFIDENTIAL].

6.4.5 Similarly, automated review interprets cached text and images [CONFIDENTIAL] and identifies potential threats like executable code, which could be used to change app features or functionality after app review and approval.
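Paragraphs 6.3.5 and 6.4.4 above describe matching submitted data and URLs against hashes of previously identified bad actors and content. The sketch below is a minimal, hypothetical illustration of that corpus-matching idea; the fingerprint function and example values are invented, and Apple's actual tooling is proprietary and not described in this Report.

```swift
import Foundation
import CryptoKit

// Reduce submitted data to a secure hash so it can be compared against
// hashes of previously blocked registrations without retaining the raw
// submission in the comparison set.
func fingerprint(_ submittedData: String) -> String {
    let digest = SHA256.hash(data: Data(submittedData.utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical corpus of fingerprints from prior blocked enrolments.
var blockedFingerprints: Set<String> = [
    fingerprint("developer-payment-instrument-123")
]

func isRepeatRegistrationAttempt(_ submittedData: String) -> Bool {
    blockedFingerprints.contains(fingerprint(submittedData))
}

// The same corpus idea applies to URLs found in app submissions: URLs
// previously flagged for illegal or harmful content can be checked for
// an exact match before deeper automated or human review.
let flaggedURLHashes: Set<String> = [fingerprint("https://known-bad.example")]

func referencesKnownBadURL(_ urls: [String]) -> Bool {
    urls.contains { flaggedURLHashes.contains(fingerprint($0)) }
}
```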
6.4.6 The information gathered during automated review flags potential risks and provides useful signals and information to human app reviewers to evaluate in more detail. In addition, such information is used to train the machine learning algorithms to continually improve detection and rejection of problematic apps. Finally, as explained in more detail below, automated processes continue after approval of apps that are available on the App Store, with automated detection and escalation mechanisms continuing to scan for potential threats.

6.4.7 Automated review capabilities are continually assessed for their performance and improved. The App Review team works with engineering teams and domain experts across Apple to identify trends flagged by human app reviewers, investigate spikes in reports relating to specific issues (e.g. via Report a Problem), and assess novel threats and the applicability of both established and emerging technologies to mitigate those threats. Multiple improvement efforts have historically been introduced each year.

(b) Human review

6.4.8 The human review component of App Review is critical to the App Store's mitigation and management of Systemic Risks. Every app and every app update undergoes human review, where trained app review specialists evaluate app features and functionality and signals provided by automated systems to screen out deceptive and abusive behaviour and ensure compliance with the Guidelines.

6.4.9 Human review builds on and complements automated review, since human app reviewers are often better positioned than automated tools to identify apps that risk physical harm, apps which are unreliable, or apps which otherwise pose concerns in ways that are not readily apparent to automated (static and dynamic) tools. As regards safeguarding user data and privacy, while the automated review will identify data access entitlements and API calls, a human app reviewer is trained to assess whether use of the entitlements and APIs is appropriate for the app's functionality. For example, a human app reviewer will likely decide that a calculator app does not need to request access to data and functionality like photos or the microphone. Similarly, app reviewers are trained to evaluate whether an app age rating is appropriate given the app's content and functionality, as well as whether apps with user-generated content have sufficient content moderation mechanisms to protect children or mitigate risks related to offensive content, harmful concepts, or public security.

6.4.10 The App Store review process is carried out by over 500 human app review experts, including over 170 individuals based in the EU, representing 81 languages across three time zones.

6.4.11 Prior to reviewing any apps, new employees receive four to six weeks of intensive training regarding, inter alia, all components of the Guidelines, including screening for privacy and data issues, particularly for children; objectionable content; apps with user-generated content; and legal considerations.

6.4.12 The App Review teams are educated on potential legal issues and risks – including highly sensitive topics such as CSAM, real money gambling, illegal content, suppression of human rights, and misleading public health information – and the appropriate escalation paths. Apps are assigned to individuals for review based on their skills, qualifications and experience, including language capabilities, cultural sensitivities, and specialised training.
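Purely as an illustration of the assignment principle in paragraph 6.4.12 – routing a submission to a reviewer whose language capabilities and specialised training match the app – the sketch below uses invented types and a simple matching rule; it is not a description of Apple's internal assignment model.

```swift
import Foundation

struct Reviewer {
    let name: String
    let languages: Set<String>
    let specialisms: Set<String>  // e.g. "kids", "gambling", "health"
}

struct Submission {
    let appName: String
    let language: String
    let sensitiveTopics: Set<String>
}

// Pick the first reviewer qualified on both language and every sensitive
// topic the submission raises; otherwise signal that escalation is needed.
func assign(_ submission: Submission, from pool: [Reviewer]) -> Reviewer? {
    pool.first { reviewer in
        reviewer.languages.contains(submission.language) &&
        submission.sensitiveTopics.isSubset(of: reviewer.specialisms)
    }
}

let pool = [Reviewer(name: "Reviewer A",
                     languages: ["de", "en"],
                     specialisms: ["kids", "health"])]
let app = Submission(appName: "ExampleKidsApp",
                     language: "de",
                     sensitiveTopics: ["kids"])
print(assign(app, from: pool)?.name ?? "escalate: no qualified reviewer")
```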
6.4.13 After initial training, new App Review personnel's work is monitored and audited, and they receive regular performance feedback and specialised training, as appropriate. All app reviewers have ongoing support and internal resources, such as mentoring, coaching, access to app review processes and policies, and weekly and ad hoc meetings with managers. The work of human reviewers is audited, and new and emerging issues feed into guidance updates and learning resources. The App Review team also monitors customer and developer feedback to assess performance. Additionally, the App Review Business Excellence team performs quality control and audits to conduct root-cause analysis and make necessary improvements, whether to tools or to performance management of reviewers.

6.4.14 The diverse App Review team tracks evolving risks in the EU and around the world, based on trends, language cues, global events, and other signals, all of which are used to continually update and train the automated and human review functions. App reviewers are kept up to date regarding new and evolving risks via the coaching, access to practices and policies, and meetings referred to in paragraph 6.4.13 above.

6.4.15 When App Review discovers apps that contain illegal content, or fraudulent or malicious content or behaviour, it adjusts the review process to prevent such apps from being approved in the future. If Apple discovers apps that have not circumvented the App Store review process per se but that are exhibiting malicious or user-unfriendly behaviours after installation, Apple similarly adjusts its processes to prevent this from reoccurring. If Apple discovers new malware on its platforms, it adjusts its custom-written malware scanners to scan apps already on the App Store and detect such malware in the future.

(c) General review practices

(i) App Review Guidelines

6.4.16 The Guidelines are the cornerstone of the App Review process. The preamble to the Guidelines notes that the guiding principle of the App Store is to provide a safe experience for users to get apps and a great opportunity for all developers to be successful. The App Review team evaluates all new apps and app updates to ensure compliance with the Guidelines.

6.4.17 Through application and enforcement of the Guidelines, the App Store aims to limit potential risks, including the Systemic Risks within its control. While Apple is unable to monitor or prevent content hosted within third-party apps, the Guidelines provide detailed, comprehensive and relevant requirements regarding developers' own risk mitigation responsibilities.

6.4.18 Particularly relevant to the DSA are Guidelines that:

(a) Prohibit objectionable content;
(b) Contain specific rules for apps with UGC;
(c) Contain specific rules for apps in the Kids category;
(d) Require developers to set appropriate age ratings; and
(e) Require compliance with privacy, intellectual property, consumer protection and all other applicable laws, including the U.S. Federal Children's Online Privacy Protection Rule ("COPPA") and GDPR.

6.4.19 Below are summaries of some of these important Guidelines that play an important role in the App Store's risk mitigation measures.

(ii) Section 1: Specific app review practices for "Safety"

6.4.20 Section 1 of the Guidelines on Safety states that users expect to feel safe in installing an app from the App Store, and need to have confidence that the app will not contain upsetting or offensive content, damage their device, or cause physical harm.
6.4.21 In 2022, 92,598 apps were rejected for non-compliance with Section 1 of the Guidelines. 51

51 App submissions may be rejected for non-compliance with one or more Guidelines.

(A) Objectionable content

6.4.22 Section 1.1 (Objectionable content) states that "Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste." Among other things, this section prohibits apps that contain:

(a) defamatory, discriminatory, or mean-spirited content;
(b) portrayals of people being killed, tortured, or abused;
(c) content that encourages violence, or illegal or reckless use of weapons;
(d) overtly sexual or pornographic material, including apps that may include pornography or be used to facilitate prostitution, or human trafficking and exploitation; or
(e) harmful concepts which capitalise on current events.

(B) User-generated content

6.4.23 Section 1.2 (User-generated content) states that apps with UGC present particular challenges, ranging from intellectual property infringement to anonymous bullying. To prevent abuse, apps with UGC or social networking services must include:

(a) a method for filtering objectionable material from being posted to the app;
(b) a mechanism to report offensive content and timely responses to concerns;
(c) the ability to block abusive users from the service; and
(d) published developer contact information.

6.4.24 Section 1.2 also provides that apps with UGC or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (for example, "hot-or-not" voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice.

(C) Kids category 52

52 The Kids category on the App Store comprises apps specifically designed for children ages 11 and under. Developers place their apps in one of three age bands based on the app's primary audience: 5 and under, 6 to 8, or 9 to 11.

6.4.25 Section 1.3 (Kids category) provides that apps in the "Kids" category must not include links out of the app, purchasing opportunities, or other distractions to kids unless reserved for a designated area behind a "parental gate". 53 In addition to complying with privacy laws applicable to children, Kids Category apps may not send personally identifiable information or device information to third parties and should not include third-party analytics or third-party advertising. In limited cases, third-party analytics may be permitted, provided that the services do not collect or transmit any identifiable information about children (such as name, date of birth, email address), their location, or their devices. Any third-party contextual advertising services in Kids Category apps must have publicly documented practices and policies for Kids Category apps that include human review of ad content for age appropriateness (and a link must be provided to such policies and practices when the app is submitted for App Review).

53 A parental gate presents an adult-level task that must be completed in order to continue. The App Store provides developers with guidance regarding the creation of parental gates here: https://developer.apple.com/app-store/kids-apps/

(D) Physical harm

6.4.26 Section 1.4 (Physical harm) warns that apps that present risks of physical harm may be rejected and, for example, prohibits apps that encourage:

(a) Consumption of tobacco and vape products, illegal drugs, or excessive amounts of alcohol;
(b) Drink-driving or other reckless behaviour, such as excessive speed; or
(c) Use of devices in a way that risks physical harm to users or others.

(iii) Section 2: Specific app review practices for "Performance"

6.4.27 Section 2.3 requires developers to ensure that all app metadata, including privacy information, their app description, screenshots, and previews, accurately reflect the app's core experience.
6.4.28 Section 2.3.8 requires all app metadata, including apps and in-app purchase icons, screenshots, and previews, to adhere to a 4+ age rating, even if the app is rated higher. By way of example, even if a developer's game includes violence, images on the App Store should not depict a gruesome death or a gun pointed at a specific character.

(iv) Section 5: Specific app review practices for "Legal"

6.4.29 Section 5 of the Guidelines states that apps must comply with all legal requirements in any location where developers make them available, and specifies that the developer is responsible for understanding and ensuring their app conforms with all local laws. In addition, Section 5 states that apps that solicit, promote, or encourage criminal or clearly reckless behaviour are unacceptable, and warns that in extreme cases, such as apps that are found to facilitate human trafficking and / or the exploitation of children, the appropriate authorities will be notified.

6.4.30 In 2022, 441,972 apps / app updates were rejected for non-compliance with Section 5 of the Guidelines.

(A) Privacy

6.4.31 Section 5.1 (Privacy) states that protecting user privacy is paramount in the Apple ecosystem, and developers must be careful when handling personal data to ensure compliance with, among other things, privacy best practices, applicable laws, the terms of the DPLA, and customer expectations.

(B) Data practices

6.4.32 Section 5.1.1 (Data Collection & Storage) provides that all apps must:

(a) include a link to their privacy policy, which must comply with Section 5.1, in an easily accessible manner;
(b) secure user consent for the collection of user or usage data;
(c) provide an easily accessible and understandable way to withdraw consent;
(d) only request access to data relevant to the core functionality of the app;
(e) respect user permission settings;
(f) allow app use without a login if the app doesn't rely on account-based features; and
(g) not compile personal information without the user's explicit consent.

6.4.33 Section 5.1.2 (Data Use & Sharing) further requires that, unless explicitly permitted by law, all apps must:

(a) not use, transmit, or share someone's personal data without first obtaining their permission;
(b) obtain explicit permission via the App Tracking Transparency APIs to track their activity;
(c) not repurpose data collected for a different purpose without additional user consent; and
(d) not attempt to secretly build a user profile based on collected data.

(C) Health

6.4.34 Section 5.1.3 (Health and Health Research) states that health, fitness, and medical data are especially sensitive, and sets out additional rules for apps with such a focus.
(D) Kids

6.4.35 Section 5.1.4 (Kids) sets out additional privacy and data requirements for children:

(a) apps must comply with all children's data protection laws (for example, COPPA and GDPR);
(b) apps should not include third-party analytics / advertising if intended for kids;
(c) use of terms like "For Kids" and "For Children" is reserved for the Kids Category; and
(d) apps not in the Kids Category cannot imply the app is for children.

(E) Location services

6.4.36 Section 5.1.5 (Location Services) provides that use of location services in an app is only appropriate if:

(a) it is directly relevant to the features and services provided by the app;
(b) the purpose of location services has been explained to the user; and
(c) the user has been notified and has provided consent before the collection, transmission, or use of any location data.

(F) Intellectual property

6.4.37 Section 5.2 (Intellectual Property) requires developers to only include content in their app if they own it or are licensed or otherwise have permission to use it, and directs developers who believe that their intellectual property rights have been infringed by another developer on the App Store to submit a claim via the App Store Content Dispute web form. 54 If the app features third-party trademarks or copyrighted content, or lets users stream or download third-party content, the developer must provide with its app submission its authorisation to use such content. 55

54 https://www.apple.com/legal/internet-services/itunes/appstorenotices/#?lang=en
55 https://developer.apple.com/app-store/review/

(G) Gaming, Gambling and Lotteries

6.4.38 Section 5.3 (Gaming, Gambling, and Lotteries) states that developers must fully vet their legal obligations everywhere their app is available. Among other requirements, apps used in connection with real money gaming or lotteries:

(a) cannot use in-app purchase to purchase credit or currency;
(b) must have necessary licensing and permissions where the app is used;
(c) must be geo-restricted to those locations; and
(d) must be free on the App Store.

(H) Developer Code of Conduct

6.4.39 Section 5.6 contains the Developer Code of Conduct. It requires developers to treat everyone with respect, including in responses to App Store reviews, customer support requests and in dealings with Apple. The Code of Conduct prohibits harassment, discriminatory practices, intimidation, and bullying. Repeated manipulative, misleading, or fraudulent behaviour will result in removal from the Apple Developer Program. It further states that apps should never attempt to "rip off" customers, trick them into making unwanted purchases, force them to share unnecessary data, or engage in manipulative practices within or outside of the app. The Code of Conduct section also states that:

(a) developer and app information must be truthful, relevant, and current;
(b) manipulating the customer experience (for example, charts, search, reviews, or app referrals) is not permitted; and
(c) indications that customer expectations are not being met (for example, excessive customer complaints, negative reviews, and excessive refund requests) may result in termination.

(d) App review escalations and new and emerging issues

6.4.40 During the App Review process, app reviewers may escalate issues to App Review specialist teams or other functional groups, as needed, to provide input, to work with developers on compliance issues, or to take action against problematic apps.
New and emerging issues are often escalated in order to seek guidance on the appropriate path forward, including for example in response to specific events, such as [CONFIDENTIAL] new technologies [CONFIDENTIAL]. Below are the key groups involved in app escalations.

(i) App Review Compliance

6.4.41 This team tracks trends of misleading app concepts and signals, as well as app spam issues. An app reviewer may escalate an app to this team to investigate app behaviour, including whether behaviour has changed since an initial review, to determine whether the app exhibits fraudulent or misleading functionality, or to determine whether developer-hosted content violates the Guidelines. If there is a problem, this team will work with the developer to bring the app into compliance or remove the app from the App Store, if appropriate.

(ii) App Store Improvements / Technical Investigations

6.4.42 If an app reviewer identifies a need for a deeper analysis of the technical functionality of an app, they will escalate the issue to Technical Investigations. For example, this team investigates whether an app uses private APIs that may violate the Guidelines' privacy and data collection requirements. Based on the results of a Technical Investigation, the app reviewer may reject the app. Additionally, learnings collected during these investigations are applied to help develop and refine automated review tools, to determine if existing and future app submissions contain similar issues.

(iii) App Review Policy

6.4.43 If an app presents a new or unique issue that requires policy or Guideline interpretation, an app reviewer will escalate that issue to the App Review Policy team. This team investigates novel apps, evolving technologies, and current trends in apps, as well as highly sensitive and legal issues. This team regularly works with and seeks advice from other functional groups, [CONFIDENTIAL]. The App Review Policy team meets on a weekly basis, and as needed, to consider app policy escalations. The App Review Policy team drives the evolution of App Review's policy enforcement efforts and informs the ongoing development of internal policies and updates to the Guidelines.

(iv) Legal, privacy, government affairs, child safety, global security investigations & regional experts

6.4.44 As explained above, the App Review teams are educated on potential legal issues and risks, including on topics such as CSAM, illegal content, suppression of human rights, and misleading public health information. On a daily basis, App Review escalates app issues to senior management in App Review and the App Store Legal team. The App Store Legal team provides legal advice and coordinates with various other internal legal and regulatory teams (including EU-based teams) across Apple (for example, Privacy Compliance, Privacy Legal, EU Regulatory Legal, Human Rights, Child Safety, Global Security), as well as external counsel, for input and advice on complex issues presented by apps.

(v) ERB

6.4.45 The ERB is composed of senior leaders who have ultimate decision-making responsibility regarding access for apps to the App Store. The ERB meets regularly and receives updates and management information from various App Store functions, including App Review and App Store Legal. These updates detail information regarding App Review processing times and approval / rejection information, and new and emerging issues, including new and novel types of apps.
6.4.46 Where escalation issues cannot be resolved by the App Review team or the App Store Legal team, they are escalated to the ERB. The ERB will then decide next steps, including app takedowns, further engagement, or an exploration of viable alternatives, as appropriate.

(e) App review rejections, suspensions, terminations, appeals

6.4.47 The underlying philosophy of the App Review team is to work with developers to ensure apps are compliant with the Guidelines, as well as local legal and regulatory requirements.

6.4.48 If an app under review is in violation of the Guidelines, the team may reach out to the developer to work with them on remediation, unless, for example, the app is clearly fraudulent. If the app is rejected, the developer receives a message describing the reasons for the app's rejection. The message identifies the Guideline that the app violates, describes the ways in which the Guideline has been violated, and provides next steps to help resolve the rejection, including access to additional resources. Developers may also request a call to discuss issues with an App Review specialist.

6.4.49 The App Review team may, depending on the severity of the issue, afford the developer 14 to 30 days to rectify an objectionable content issue (for example, by content takedowns or user blocking) before removing the app or taking additional measures. They may also require the developer to update their content moderation plan and confirm mitigation measures are in place to avoid recurring issues.

6.4.50 Developers can respond to the reviewer with a request for additional information or further discussion of the issues, or may dispute the findings.

6.4.51 App removals and developer terminations are the most severe measures, to be undertaken in circumstances where remediation attempts have failed or are not an option, such as in circumstances where the app is fraudulent or facilitates illegal activity.

6.4.52 As explained in the "After You Submit" section of the Guidelines, developers can dispute decisions of App Review regarding app rejections or developer terminations via an appeals process, which is overseen by the App Review Board (the "ARB"). 56 The ARB is composed of experienced App Review specialists who investigate claims asserted in an appeal, and the history of the app and interactions with the developer, and who seek input from specialised functions where appropriate.

6.4.53 Very few appeals are sustained, which tends to confirm the robust nature of app removal and developer termination decisions. For example, in 2022, Apple removed 186,195 apps from the App Store. Only 18,412 of those decisions were appealed, and 616 resulted in the app being restored. 57 Similarly, 428,487 developer accounts were terminated. Only 3,338 developer account terminations were appealed and, of those, 159 resulted in a restoration. 58

56 https://developer.apple.com/app-store/review/ – see "Appeals". This page includes a link to a form for developers to submit appeals.
57 As noted in the 2022 Transparency Report, most app removals that are appealed were removed from the App Store due to illegality or fraud. Consequently, most appeals from developers of such apps are rejected.
58 https://www.apple.com/legal/more-resources/docs/2022-App-Store-Transparency-Report.pdf

(f) Ongoing monitoring

6.4.54 The App Review process does not stop once an app is approved and published on the App Store.
This is necessary for a number of reasons:

(a) Initial automated and human review cannot be expected to have a 100% success rate. Problematic app developers go to great effort to hide malicious functionality in their apps. As a result, sometimes malicious apps are published on the App Store, despite Apple's extensive risk mitigation measures.

(b) Many apps contain content that changes over time. Developers of fraudulent apps sometimes introduce a switching mechanism that makes the app appear benign (like a simple game) during initial review but contains a trigger that can be switched post-approval to serve illicit or fraudulent content (i.e. "bait-and-switch"). In 2022, Apple blocked or removed 23,823 apps for bait-and-switch tactics.

(c) An approved app may also be found to have misrepresented its privacy policies and be illegally using personal information. An app might also evolve into a threat not inherent to its design. For example, a simple message board app that appears harmless on its face during App Review might later be used for illegal purposes.

6.4.55 Ongoing App Review, through automated scans and other threat detection tools, addresses the impact of a threat discovered post-approval. These tools help ensure that Apple can identify the developer, track malicious patterns by the same developer, identify similar patterns presented by other apps, and cut off distribution at a single source. Apple can directly communicate with the app developer and rapidly remove the app from the App Store if necessary.

6.5 App Store and Privacy

6.5.1 Pursuant to Article 34(2) of the DSA, Apple is required to assess how its "data related practices" influence the Systemic Risks. An overview of relevant practices and controls is detailed below.

(a) App Store & Privacy Notice

6.5.2 When first interacting with the App Store, users are presented with service-specific privacy information, in the form of the App Store & Privacy Notice. 59 This ensures that users have an effective choice and that any consent to data use on Apple products is fully informed.

6.5.3 Also presented to users at this time is Apple's Data & Privacy Icon, which links to more detailed on-screen information and more detailed service-specific privacy information regarding the App Store's privacy practices. This provides users with transparent and easily accessible information that details how Apple collects, processes and discloses their personal data.

59 https://www.apple.com/legal/privacy/data/en/app-store/

6.5.4 The App Store uses, inter alia, local, on-device processing to enhance its recommendations and mitigate privacy risks. In addition, using data such as app installs, the App Store can suggest apps and in-app events that are more relevant to users. These recommendation systems are described from paragraph 6.6 below.

6.5.5 The App Store & Privacy Notice also explains how users can turn off personalisation features. Personalisation is also described in further detail from paragraph 6.6.32 below.

6.5.6 When a user uses a payment card in the App Store, Apple may obtain information from the financial institution or payment network, and may also use it for fraud prevention and verification.

(b) Privacy Nutrition Labels

6.5.7 Product pages in the App Store feature a section that includes summaries, prepared by developers, of their key privacy practices in a simple, easy-to-read label, which informs the user about the app's privacy practices before downloading it. These labels show how developers are collecting and using user data, such as a user's location, browsing history, and contacts.
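By way of illustration only, the sketch below models the kind of information a privacy nutrition label conveys. The section names loosely follow Apple's publicly documented privacy-details taxonomy; the Swift types themselves are hypothetical and are not an Apple API.

```swift
// Hypothetical model of a developer-prepared privacy nutrition label.
enum DataCategory: String {
    case location = "Location"
    case browsingHistory = "Browsing History"
    case contacts = "Contacts"
}

enum LabelSection: String {
    case usedToTrack = "Data Used to Track You"
    case linkedToUser = "Data Linked to You"
    case notLinkedToUser = "Data Not Linked to You"
}

struct PrivacyNutritionLabel {
    let appName: String
    let declarations: [LabelSection: [DataCategory]]
}

// A summary of the kind surfaced on an app's product page.
let label = PrivacyNutritionLabel(
    appName: "ExampleApp",
    declarations: [
        .linkedToUser: [.contacts, .location],
        .notLinkedToUser: [.browsingHistory]
    ]
)
```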
6.5.8 The same applies to Apple's own apps. 60 Privacy nutrition labels are an innovative and easily understandable feature which makes use of clear language and images / icons to explain how data is used.

60 https://www.apple.com/privacy/labels/

(c) App Privacy Report

6.5.9 The App Privacy Report, accessible via a user's Settings, records data on device and sensor access, app and website network activity, and the most frequently contacted domains, in an encrypted form on user devices. 61 Via this report, users are able to see how often their location, photos, camera, microphone, and contacts have been accessed by apps during the last seven days, and which domains those apps have contacted. Users therefore have full and easy visibility into the ways apps use the privacy permissions a user has granted them, as well as their respective network activity. Together with Privacy Nutrition Labels, this feature provides users with transparent information about how the apps made available on the App Store treat user privacy.

(d) App Tracking Transparency Framework

6.5.10 If a developer wants to track a user across apps and websites or access their device's data for advertising purposes, they must seek the user's permission through the App Tracking Transparency Framework. This applies across all apps available on the App Store. Tracking in this instance refers to linking user or device data collected from an app with user or device data collected from other companies' apps, websites, or offline properties for targeted advertising or advertising measurement purposes. Tracking also refers to sharing user or device data with data brokers. If the user has not granted permission to this tracking, the relevant app will not be able to access any user data.

6.5.11 An app tracking section in Settings lets users easily see which of their apps have been given permission to track, so they can change their preferences and disable apps from asking in the future.

(e) Access Permissions and App Sandbox

6.5.12 Apps may request access to features such as a user's location, contacts, calendars, or photos. The App Sandbox protects user data by limiting access to resources requested through entitlements. Users receive a prompt with an explanation the first time an app wants to use this data, allowing them to make an informed decision about granting permission. Developers are required to obtain permission from users, via a simple, clearly understandable, and prominently placed means, before tracking them or tracking their devices across apps and websites owned by other companies for ad targeting, for ad measurement purposes, or to share data with data brokers. Even if a user grants access once, they can change their preferences in Settings at any time. In addition, no app can access the microphone or camera without the user's permission. When an app uses the microphone or camera, the user's device displays an indicator to let the user know it is being used – whether the user is in the app, in another app, or on the Home Screen. In addition, the Control Center on a user's device shows the user if an app has recently used the microphone or camera.

61 https://support.apple.com/en-us/HT212958

6.5.13 The App Sandbox provides protection to system resources and user data by limiting a developer's app's access to resources requested through entitlements. This creates secure silos to protect the data of end users across the device.
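The App Tracking Transparency and access-permission flows described above are public Apple APIs. The snippet below shows, in outline, how a third-party app must ask before tracking or before using the camera; the API names are Apple's public ones, while the surrounding function is illustrative (a real app would also declare the NSUserTrackingUsageDescription and NSCameraUsageDescription strings in its Info.plist).

```swift
import AppTrackingTransparency
import AVFoundation

// Outline of the user-permission flows described above. Until the user
// grants the App Tracking Transparency prompt, the app receives no
// permission to track.
func requestPermissions() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking permitted; linking data across companies' apps
            // and websites is allowed only from this point.
            break
        case .denied, .restricted, .notDetermined:
            // No permission to track; the app must function without it.
            break
        @unknown default:
            break
        }
    }

    // Camera access similarly requires explicit user consent; the system
    // shows an indicator whenever the camera is in use.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if !granted {
            // Proceed without camera functionality.
        }
    }
}
```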
6.6 Recommender Systems Risk Mitigation Measures

6.6.1 Pursuant to Article 34(2) of the DSA, Apple is required to assess how its "recommender systems and any other relevant algorithmic systems" influence the Systemic Risks. An overview of App Store recommender systems and the search function is detailed below.

6.6.2 As explained in Section 4 above, users can discover apps available in the App Store through five tabs: Today, Games, Apps, Arcade, and Search. The apps that are displayed in these tabs appear organically (for example, in various categories of "Top" charts) in all tabs except Search; as "recommendations" in the form of algorithmically selected recommendations or editorially curated recommendations in all tabs; as a search result in the Search tab; or as an Apple Search Ad in the Today or Search tabs. App recommendations may also be personalised based on a user's demographic, as well as App Store purchase and download history. Notably, all apps appearing in the App Store, including recommendations, have already undergone the rigour of the App Review process and have been approved for publication in the App Store.

(a) Algorithmically Selected App Recommendations

6.6.3 Apple maintains an app repository that describes various attributes of apps during their lifecycle in the App Store. For example, the app repository includes standard app information and metadata supplied by the developer, such as the name of the app and developer, when the app was released, the app categories, and the app's age rating. It also includes information about the app's popularity, including statistics on app downloads and transactions; aggregate and anonymised user engagement signals, such as browse and search activity; and fraud trust signals. [CONFIDENTIAL].

6.6.4 Whether an app appears in recommendations depends on machine learning algorithms that interpret information from the app repository related to: (i) app quality; (ii) app popularity; (iii) app sensitivities; and (iv) the context of the recommendation.

6.6.5 Not all apps may appear as recommendations. [CONFIDENTIAL] For example, if the App Store becomes aware of violations of the Guidelines, the app may be removed from recommendations until the app becomes compliant. [CONFIDENTIAL].

(b) Editorially Curated App Recommendations

6.6.6 The App Store Editorial team uses apps from the app repository to curate its own unique app recommendations. Factors that App Store editors consider when considering recommendations include: (i) user interface design: the usability, appeal, and overall quality of the app; (ii) user experience: the efficiency and functionality of the app; (iii) innovation: apps that solve a unique problem for customers; (iv) localisations: high quality and relevant; (v) accessibility: well-integrated features; (vi) App Store product page: compelling screenshots, app previews, and descriptions; and (vii) uniqueness.

6.6.7 For games, editors also consider: (i) gameplay and level of engagement; (ii) graphics and performance; (iii) audio; (iv) narrative and story depth; (v) ability to replay; and (vi) gameplay controls.

6.6.8 The Editorial team creates a curated catalogue of apps for each category used in the various tabs (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favorites, Get Started).
(b) Editorially Curated App Recommendations

6.6.6 The App Store Editorial team uses apps from the app repository to curate its own unique app recommendations. Factors that App Store editors consider when curating recommendations include: (i) user interface design: the usability, appeal, and overall quality of the app; (ii) user experience: the efficiency and functionality of the app; (iii) innovation: apps that solve a unique problem for customers; (iv) localisations: high quality and relevant; (v) accessibility: well-integrated features; (vi) App Store product page: compelling screenshots, app previews, and descriptions; and (vii) uniqueness.

6.6.7 For games, editors also consider: (i) gameplay and level of engagement; (ii) graphics and performance; (iii) audio; (iv) narrative and story depth; (v) ability to replay; and (vi) gameplay controls.

6.6.8 The Editorial team creates a curated catalogue of apps for each category used in the various tabs (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favorites, Get Started). For each curated category, the Editorial team determines whether to pin certain categories in designated vertical positions of tabs. They can also choose to personalise categories, as described below. If a story has been personalised, the curated category would surface and order stories that are most relevant based on a user's purchase and download history.

6.6.9 The Editorial team maintains and updates curation guidelines, which identify apps that are "not recommendable" (despite having been through App Review) and local sensitivities, for editors to reference. The curation guidelines have been distilled into best practices, which are publicly available to help developers understand what the App Store finds valuable in curation for users.62

62 https://developer.apple.com/app-store/discoverability/

(c) App Store Search Results function

6.6.10 Within the Search tab, users can use the "search" function to search for games, apps and Stories. This search function is designed to help users find the apps they are looking for as efficiently as possible.

6.6.11 Users can search in one of the 40 languages available on the App Store. When a user starts typing a search word, they are presented with a number of suggested terms in a list, before they hit the "search" button to action the search. These suggested terms are selected by algorithm. The dominant factor that determines these suggested terms is prior aggregate user search behaviour in the storefront in which the user is searching. This user behaviour is tracked on an anonymised basis and not per individual user. If there are few prior searches similar to what a user has started typing, another algorithm will suggest terms based on app name-matching.

6.6.12 When a user clicks on "search" they are presented with search results. These search results are unique to the App Store storefront associated with the user's account. Search results are determined by an algorithm, which determines results based on a number of factors, including:

(a) text relevance (for example, using an accurate app title), relevant keywords / metadata, and the category of app a user has searched for (for example, games);

(b) signals associated with aggregated user behaviour, including app searches and downloads, number and quality of ratings and reviews, and app downloads in the storefront the user is searching in; and

(c) date of launch in the App Store.

6.6.13 When an app is new and does not have significant numbers of searches or user signals associated with it, it is automatically boosted by the search results algorithm. Once the app has sufficient exposure in the search function, and the algorithm has collected sufficient signals regarding its popularity / quality, the boost is removed.
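The following hedged sketch illustrates how the ranking factors listed at paragraph 6.6.12, together with the new-app boost described at paragraph 6.6.13, could in principle be combined. The weights, thresholds and names are hypothetical and are not drawn from the actual search algorithm.

```swift
// Hypothetical search candidate: text relevance, aggregated behaviour
// signals, and launch recency (paragraph 6.6.12).
struct SearchCandidate {
    let textRelevance: Double     // title/keyword/metadata match, 0...1
    let behaviourSignal: Double   // aggregated searches, downloads, ratings
    let daysSinceLaunch: Int
    let signalSampleSize: Int     // how much behavioural data exists
}

// Combine the factors; all weights are invented placeholders.
func searchScore(_ c: SearchCandidate) -> Double {
    var score = 0.6 * c.textRelevance + 0.4 * c.behaviourSignal
    // Paragraph 6.6.13: new apps without significant signals receive a
    // temporary boost, removed once enough signals have been collected.
    if c.signalSampleSize < 1_000 && c.daysSinceLaunch < 30 {
        score += 0.1
    }
    return score
}
```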
6.6.14 In limited circumstances, Apple may manually override results by removing or adding a given app listing from the search results. For example, if a developer adds keywords to their listing attempting to rank in queries for which they are not relevant, Apple can remove their result for that search query.

6.6.15 Apple applies the same search algorithm, applying the same factors, to its own apps as it does to third-party apps.

6.6.16 Search results are not personalised. However, some personalisation of the presentation of the results may occur on-device, for example if a user searches for an app that they have already downloaded to their device. In such instances, the search results may include product information about the already downloaded app in a more condensed form.

(d) Apple Search Ads

6.6.17 Apple Search Ads is a service by which developers can pay for promoted placements of their apps in the App Store.

6.6.18 Within the App Store, Apple Search Ads appear in the Today tab, the Search tab and Search results, and in app product pages users access while browsing. These promoted app placements appear on the App Store itself and are distinct from and unrelated to the third-party advertisements that may be shown within an app, for which the developer, and not Apple, is responsible.

6.6.19 Apple Search Ads only feature apps already available in the App Store in the subject country or region.

6.6.20 With Apple Search Ads, it is made clear to users that they are seeing a promoted app placement (as opposed to an editorial / organic placement) through clear and conspicuous visual cues intended to make a clear distinction between promoted app placement and organic content. All such promoted app placements include a prominent "Ad" mark, and may include border and background shading demarcations. Moreover, the "Ad" mark is interactive; when a user taps on it, they see an "About this Ad" sheet, which explains why they are seeing that particular app and what criteria, if any, were used to display the relevant app campaign. If a user clicks on the promoted app, they are taken to the app product page.

6.6.21 Apple Search Ads determines which apps get promoted placement via a bid auction mechanism: advertisers pay only what they are willing to pay in a competitive auction marketplace, based on their individual preferences, including bids for actions like taps or installs.
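As a hedged illustration of the auction mechanism described at paragraph 6.6.21, the sketch below selects the winning bid for an action such as a tap or install. The types and selection rule are hypothetical simplifications, not the actual auction design.

```swift
// Hypothetical bid: the maximum amount an advertiser is willing to pay
// for a given action (paragraph 6.6.21).
struct AdBid {
    let campaignID: String
    let maxCostPerTap: Double
}

// Select the highest bid among eligible campaigns; an advertiser never
// pays more than its own stated maximum.
func runAuction(_ bids: [AdBid]) -> AdBid? {
    bids.max(by: { $0.maxCostPerTap < $1.maxCostPerTap })
}
```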
6.6.22 All developers who promote their apps using Apple Search Ads must contractually commit that their promoted apps will comply with all applicable laws and regulations.

6.6.23 Apple takes several measures to address risk relating to Apple-delivered promoted app placement on the App Store. For example, in addition to the actions performed by the App Review team to review and approve apps for distribution on the App Store, the Apple Search Ads team additionally reviews promoted app placement for content, imagery, and promotion category classification. Apple Search Ads policies prohibit certain categories of apps from being promoted on the App Store – either altogether, in certain countries or regions, or in certain App Store placements.63 Moreover, some categories of apps that are not prohibited may still face promotion restrictions as managed by the Apple Search Ads team – for example, submitting proof of specific permits or licences to Apple as a prerequisite to advertising, including the promotion of apps, in certain countries or regions.

63 https://searchads.apple.com/policies/

6.6.24 Additionally, the Apple Search Ads team routinely monitors account and advertiser actions for signs of potential misconduct and handles complaints relating to Apple Search Ads advertising.

6.6.25 Apple Search Ads is engineered to facilitate promoted app placements in a manner that ensures that the App Store does not know which promotional app has been surfaced to a user, or whether an identifiable user has viewed or clicked on it.

6.6.26 Apple creates "segments" to deliver personalised Apple Search Ads on the App Store. Segments are groups of people who share similar characteristics. Information about a user may be used to determine which segments they are assigned to, and thus, which Apple Search Ads they receive. To protect user privacy, personalised Apple Search Ads are delivered only if more than 5,000 people meet the targeting criteria selected by an advertiser.

6.6.27 Information used to assign a user to segments is strictly limited and includes account information (for example, name, address, age, gender) and records of downloads, purchases and subscriptions on the App Store. When selecting which Apple Search Ad to display from multiple ads for which a user is eligible, Apple may use some of this information, as well as App Store searches and browsing activity, to determine which ad is likely to be most relevant. This information is aggregated across users so that it does not identify any single user.

6.6.28 Pursuant to its obligation under Article 39 of the DSA, Apple has created a public online repository of apps promoted as Apple Search Ads.64 The repository sets out information about each app presented as an Apple Search Ad to consumers within the EU, including what content was presented where, and when. The repository is designed to contain this information for the period that the Apple Search Ad unit is live, and for one year from the date of its last impression. For content that is restricted due to alleged illegality, a governmental order, or incompatibility with applicable terms and conditions, the repository is designed to record the restriction as well as the grounds for the restriction. The repository is accessible and can be queried through a dedicated website. An API is also available for large volume queries.

64 https://adrepository.apple.com/

6.6.29 Apple Search Ads is built with strong limitations to protect children and minors:

(a) For a minor under 18 (or the age of majority in the relevant jurisdiction) who is logged in with their Apple ID, the Personalised Ads setting is automatically set to "off" and cannot be enabled until the user reaches the age of majority. With Personalised Ads set to off, Apple cannot use account information (for example, name, address, age, gender), app downloads, or in-app purchases and subscriptions, for serving Apple Search Ads in the App Store.

(b) When a user turns 18 (or the relevant age of majority), the App Store app will display a prompt to allow the user to choose whether or not to agree to receive personalised Apple Search Ads on the App Store.

6.6.30 Furthermore, as explained in Section 4 above, each app has an age rating. These age ratings, and the age of the user, determine whether, and if so, which Apple Search Ads will be displayed to users under 18 years of age, subject always to the following limitations:

(a) Apple Search Ads are not presented to users under the age of 13;

(b) apps rated 17+ are not presented to users under 18 as Apple Search Ads; and

(c) certain categories of apps, irrespective of age rating, are not presented to users under 18 as Apple Search Ads.

6.6.31 For users over 18, it is the developer's responsibility to configure minimum age targeting to meet local law requirements.

(e) Personalisation

6.6.32 Personalised Recommendations are not available for minors, managed accounts, and accounts that have opted out of personalised recommendations.

6.6.33 For a child account, i.e. one registered via Family Sharing and under 13 (or the minimum age of lawful consent in the relevant jurisdiction in application of Article 8 of the GDPR), the Apple ID is not eligible to receive any personalised recommendations in the App Store.
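The limitations described above lend themselves to a simple decision procedure. The following hedged sketch combines the privacy threshold at paragraph 6.6.26 with the age-based limitations at paragraphs 6.6.29 and 6.6.30; the type and function names are hypothetical, while the thresholds (5,000 people; ages 13 and 18) are taken from the text above.

```swift
// Hypothetical candidate ad: the promoted app's age rating, whether it
// falls within a category never shown to minors, and the size of the
// targeted segment.
struct AdCandidate {
    let appAgeRating: Int         // 4, 9, 12 or 17
    let restrictedCategory: Bool  // categories never shown to under-18s
    let segmentSize: Int          // people matching the targeting criteria
}

func mayServeSearchAd(userAge: Int, personalised: Bool, ad: AdCandidate) -> Bool {
    // Paragraph 6.6.30(a): no Apple Search Ads for users under 13.
    if userAge < 13 { return false }
    // Paragraphs 6.6.30(b) and (c): 17+ rated apps and certain
    // categories are never promoted to users under 18.
    if userAge < 18 && (ad.appAgeRating >= 17 || ad.restrictedCategory) {
        return false
    }
    // Paragraph 6.6.26: personalised ads require a segment of more
    // than 5,000 people.
    if personalised && ad.segmentSize <= 5_000 { return false }
    return true
}
```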
6.6.34 Users can change the Personalised Recommendations setting for their Apple ID by going to iOS Settings > [user name], tapping Media & Purchases, tapping View Account, and then toggling Personalised Recommendations on or off. Users can also learn more about which information is used to personalise the recommendations made to them (for example, information about purchases, downloads, and other activities in the App Store).

6.6.35 If Personalised Recommendations is turned on, user interactions within the App Store may be used to personalise app recommendations and editorial content. For example, the App Store Today tab will recommend content that may be of interest to the user based on what they have previously searched for, viewed, downloaded, updated, or reviewed in the App Store. Recommendations are also based on user purchase history, including in-app purchases, subscriptions, and payment methods, together with account information derived from the user's Apple ID.

6.6.36 In addition, personalised recommendations are based on aggregate information about app launches, installs, and deletions from users who choose to share device analytics with Apple, and aggregate information about app ratings.

6.6.37 If Personalised Recommendations is turned off, a user will not receive personalised recommendations or editorial content. Instead, recommendations from the app repository will display apps without reference to the user's engagement with the App Store.

(f) Mitigating potential third-party abuses

6.6.38 The Trust and Safety Operations team is responsible for "live moderation" of App Store hosted UGC and for protecting App Store discovery features, including charts and search, from fraudulent behaviour, including the behaviour of "bots". Inauthentic ratings and reviews from fraudulent or bot accounts can mislead users into downloading an untrustworthy app that attempts to game the system through misrepresentation.

6.6.39 The Trust and Safety Operations team uses a number of automated monitoring tools to identify suspicious accounts, apps and app-related activity. These systems help detect suspicious charts and search manipulation. Trust and Safety Operations can take a range of steps to protect against suspicious charts and search manipulation, which include suppressing an app from search for a limited period. They can also take action against developers who repeatedly manipulate App Store discovery features, up to and including termination of developer accounts.

6.6.40 The Trust and Safety Operations team evaluates the efficacy of the automated signals it receives regarding bot accounts and suspicious activity and drives conversations regarding possible improvements.
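A hedged sketch of the kind of signal-driven protection described at paragraphs 6.6.38 and 6.6.39 follows: a manipulation signal above a threshold results in an app being suppressed from search for a limited period. The signal definition, the threshold, and the seven-day window are all hypothetical placeholders.

```swift
import Foundation

// Hypothetical manipulation signal for an app, e.g. the share of its
// recent reviews flagged as originating from bot accounts.
struct ManipulationSignal {
    let appID: String
    let inauthenticReviewRate: Double  // 0...1
}

// If the signal crosses an (invented) threshold, suppress the app from
// search for a limited period, per paragraph 6.6.39.
func suppressionWindow(for signal: ManipulationSignal) -> DateInterval? {
    guard signal.inauthenticReviewRate > 0.3 else { return nil }
    return DateInterval(start: Date(), duration: 7 * 24 * 60 * 60)
}
```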
6.7 App Store User-Generated Content Measures

6.7.1 Pursuant to Article 34(2) of the DSA, Apple is required to assess how its "content moderation systems" influence the Systemic Risks. The App Review process is detailed earlier in this Section. An overview of App Store UGC controls is detailed below.

6.7.2 As explained above, the only UGC on the App Store is user-generated app ratings and reviews.

6.7.3 The Trust and Safety Operations team is responsible for moderating user ratings and reviews, as well as developers' responses to reviews. It takes both preventative and responsive steps by way of mitigation of risks arising from UGC, which include the publication of false, illegal or harmful content, or fraudulent conduct that is designed to manipulate an app's rating ("Rating and Review" fraud). Without ratings and reviews moderation, misleading and fraudulent information would spread on the App Store, which could lead users to download malicious apps.

6.7.4 A number of key process mitigations apply to user submission of ratings and reviews. In particular, ratings and reviews can only be submitted by registered users who have downloaded the relevant app. Furthermore, all user ratings and reviews are subject to a publication delay before being published on the App Store.

6.7.5 A number of monitoring processes are carried out to protect against fake or fraudulent reviews, and developer responses, including scanning for spam, profanity and foul language, and multiple duplicate or similar entries.

6.7.6 Reviews can be sorted by helpfulness, rating, or recency. When ordering reviews by helpfulness, Apple considers the review's source, quality, thoroughness, and timeliness, as well as how other customers have engaged with the review.

6.7.7 The Trust and Safety Operations team also reacts when it is alerted to potentially problematic ratings and reviews, or developer responses, via "Report a Concern". This functionality and the related process are described in further detail from paragraph 6.8.9 below.

6.7.8 The Trust and Safety Operations team works with a variety of partner teams, including AppleCare, to continually improve the automated processes that flag and block fake or fraudulent reviews prior to publication, and the post-publication review and escalation procedures.

6.7.9 In 2022, the App Store processed over one billion ratings and reviews, of which more than 147 million were blocked and removed for failing to meet its moderation standards.65

65 https://www.apple.com/newsroom/2023/05/app-store-stopped-more-than-2-billion-in-fraudulent-transactions-in-2022/

6.8 App Store External Notice and Action Measures

6.8.1 As detailed above, there are multiple proactive controls in the App Store designed to stop problematic apps being published on the App Store. There are further controls in place that ensure that only a smaller subset of apps are recommended to users, either as recommended or editorial content, or as Apple Search Ads.

6.8.2 In addition, there are also various reactive controls in place, which are designed to ensure that users, developers, government agencies and others can alert the App Store to problematic apps that have already been published on the App Store.

(a) Report a Problem

6.8.3 The Report a Problem function is a tool to help users raise concerns to the App Review team and other teams about content they may encounter on the App Store. Consumer protection is a priority of the App Store, and an area of focus for the App Store Trust and Safety Operations team. "Report a Problem" is a cross-functional effort which originated from collaboration between Trust and Safety Operations team engineers and product managers, and their counterparts in the App Review team and Worldwide Developer Relations, to create user- and developer-facing solutions to address common concerns in the App Store.

6.8.4 The Report a Problem link is displayed in the quick links at the bottom of the Games and Apps tabs, or from the product page of any app a user has purchased or downloaded. Users can choose from "report a scam or fraud" and "report offensive, abusive, or illegal content" options to submit their concern about content they have purchased or downloaded. Users are presented with a free text field to describe the issue they are reporting.

6.8.5 [CONFIDENTIAL].

6.8.6 [CONFIDENTIAL].
6.8.7 [CONFIDENTIAL].

6.8.8 [CONFIDENTIAL].

(b) Report a Concern

6.8.9 The Report a Concern tool is another key control which allows users and developers to raise concerns regarding the content of specific user reviews, and developer responses to such reviews. Concerns can be raised in relation to any content where reviews are available.

6.8.10 Report a Concern is available to developers in App Store Connect, as well as to developers and users on the App Ratings and Review page, where users can press and hold on the review and Report a Concern will appear in the pop-up menu. The Trust and Safety Operations team works with AppleCare to review external escalations raised via "Report a Concern".

6.8.11 Report a Concern could be used in the following scenarios:

(a) Users or developers seeking to flag misleading, offensive, illegal or irrelevant content, or content that otherwise violates the Submission Guidelines of the AMS Terms, in reviews. All such flagged reviews are subject to moderation.

(b) Where a developer may post offensive, illegal, or misleading responses to critical reviews.

(c) Developers are encouraged, in the event they see a review that contains offensive material, spam, or other content that violates the AMS Terms and Conditions, to use the Report a Concern option under the review in App Store Connect instead of responding to the review.

6.8.12 AppleCare reviews Report a Concern escalations, and performs an initial triage for offensive content, including illegal content, instances of profanity, solicitation, or spam. Reported concerns go into a queue for the AppleCare team, which is trained by Trust and Safety Operations on identifying user review violations and actioning concerns, as well as escalating issues to other relevant teams as necessary. The AppleCare team receives guidance and training on how to consider a reported concern, including investigation, follow-up and escalation paths.

6.8.13 Following its consideration, AppleCare can leave the review as-is, remove a review or developer response, and / or disable the ability to review from a user account. If a reported concern contains a threat or reference to suicide, malicious activity that implies bodily harm, child safety and / or child exploitation concerns, or otherwise indicates a safety issue, the AppleCare team is instructed to send an email to escalate the matter directly to Trust and Safety Operations. The Trust and Safety Operations team will then forward the review and its associated data, including reviewer ID and email address, to Apple's Global Security Investigations team for further action, which may include alerting law enforcement. Apple has updated its processes to reflect the requirements in Article 18 of the DSA.

6.8.14 AppleCare continuously monitors new trends among the customer concerns being reported and escalated. AppleCare partners with a variety of teams, including Trust & Safety Operations, to adapt ratings and reviews detection and response measures where appropriate.
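The triage outcomes described at paragraphs 6.8.12 and 6.8.13 can be summarised in the following hedged sketch; the enumerated categories and routing rules are hypothetical simplifications of the process described above.

```swift
// Hypothetical categories an initial triage might assign to a
// reported concern (paragraph 6.8.12).
enum ConcernCategory {
    case profanity, spam, solicitation
    case threatOrSelfHarm, childSafety
}

// Outcomes available following consideration (paragraph 6.8.13).
enum TriageOutcome {
    case leaveAsIs
    case removeContent
    case disableReviewing           // for repeat-offending accounts
    case escalateToTrustAndSafety   // may reach Global Security Investigations
}

func triage(_ category: ConcernCategory,
            violatesGuidelines: Bool,
            repeatOffender: Bool) -> TriageOutcome {
    switch category {
    case .threatOrSelfHarm, .childSafety:
        // Safety issues are escalated directly (paragraph 6.8.13).
        return .escalateToTrustAndSafety
    case .profanity, .spam, .solicitation:
        if !violatesGuidelines { return .leaveAsIs }
        return repeatOffender ? .disableReviewing : .removeContent
    }
}
```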
(c) Notices Routed to App Store Legal

6.8.15 The App Store Legal team is responsible for reviewing and vetting notices from external sources that involve issues with apps in the App Store. Government regulatory authorities routinely send notices to the App Store via a dedicated email inbox, [CONFIDENTIAL]. Such notices typically involve a request for information about an app or developer, or a demand to take down an app pursuant to local law or court order. Likewise, local law enforcement authorities send notices and requests for information to a similar dedicated email inbox, lawenforcement@apple.com. In addition, customers, developers, government authorities or other parties may provide notices to various functions throughout Apple, which are then routed to the App Store Legal team.

6.8.16 The App Store Legal team works with the App Review team, which reviews and investigates the app for any issues identified in the government notice. If the App Review team identifies a Guideline violation, they will employ standard operating procedures to engage the developer and ensure the app is brought into compliance with the Guidelines, or remove the app and / or terminate the developer, if the circumstances warrant it. If there is a valid legal basis or government order to remove the app, the App Review team will take appropriate action and may communicate the issue to the developer, as appropriate. This may include removing the app from the local storefront in question, to comply with local law.

(d) Content disputes

6.8.17 Rights holders can submit App Store content disputes via a dedicated webpage.66 These submissions are routed to the AMS Content Disputes Legal team for consideration.

66 https://www.apple.com/legal/internet-services/itunes/appstorenotices/#/contacts?lang=en

6.8.18 Once the AMS Content Disputes Legal team receives a complete complaint, the team responds with a reference number.67 They put the complainant in direct contact with the provider of the disputed app. If needed, complainants can then correspond with the AMS Content Disputes Legal team directly via email. The parties to the dispute are primarily responsible for its resolution.

67 In the event that a party abandons a claim, Apple has automated templates which are sent out as reminders, and if no response is received, the matter will be recorded as having been closed.

6.8.19 However, in certain cases, including where the parties are unable to resolve the dispute bilaterally, the AMS Content Disputes Legal team will intervene. The team does not take apps down solely on the basis of fraudulent or anti-competitive claims, but instead will consider a number of factors when deciding whether or not to remove potentially violative apps from the App Store. These include:

(a) whether the app or developer has been the subject of other complaints;

(b) the frequency of such complaints; and

(c) whether there is reasonable indication that an intellectual property violation has occurred.

6.8.20 If there are continued violations by a developer, or the developer makes fraudulent misrepresentations of material facts, the AMS Content Disputes Legal team may have a developer's account terminated.

6.8.21 The AMS Content Disputes Legal team addresses and mitigates risks of potential intellectual property violations on the App Store, and prevents repeat offenders from accessing Apple's services and causing subsequent infringements. The AMS Content Disputes Legal team has implemented various controls and processes in order to do so.

(e) New Content Reports portal for DSA

6.8.22 Apple enhanced its escalation and reporting mechanisms to adequately capture reported concerns relating to Systemic Risks which may stem from the App Store or its use. In that regard, and in connection with its efforts to comply with Article 16(1) of the DSA, Apple enhanced its Report a Problem feature and created a new Content Reports portal, to enable third parties in the EU to report illegal content.

6.8.23 In August 2023, the Report a Problem flow was updated to achieve integration with the new Content Reports portal. If a user on a storefront in the EU engages Report a Problem in the App Store, they can select "Report offensive or abusive content" or "Report illegal content" from the menu of options.
If they select the former, the user goes through the process flow outlined from paragraph 6.8.3 et seq. above. If they select the latter, they are redirected to the Content Reports portal.

6.8.24 [CONFIDENTIAL]. All remaining notices will undergo manual triage before submission to App Review. Manual triage will help Apple track and understand the kinds of notices it receives, [CONFIDENTIAL] and help identify possible misuse and abuse of the system. Once a notice passes through these triage systems, an automatic acknowledgment communication will be sent to the notifier.

6.8.25 After undergoing a verification process intended to safeguard the system and prevent abuse, government representatives (and in due course trusted flaggers) can submit notices which bypass the triage systems and are processed on an expedited basis. Government representatives and trusted flaggers will also receive acknowledgment communications when their notice is submitted to App Review for analysis.

6.8.26 The App Review team collaborates with relevant internal teams and partners, including the App Store Legal team when appropriate, to review, analyse, and action the notices. Once an action is taken, the Content Reports portal facilitates necessary communications to notifiers and designated appointees about the actions taken, and when necessary, to impacted consumers who purchased illegal products or services.

6.8.27 If a notifier disagrees with an outcome, they have the option to challenge the decision via https://contentreports.apple.com/Complaints. These complaints are received through a separate section of the Content Reports portal and are routed to senior App Review analysts for review. The senior App Review analyst reviews the original notice alongside any new information provided by the complainant. These senior App Review analysts partner with relevant internal teams, including the App Store Legal team where necessary, to evaluate the complaints. Some matters may be escalated for review by the ERB. Communications are sent to complainants as part of this process.

6.8.28 In order to meet the DSA transparency reporting obligations, data is collected throughout the various steps in the described content reporting flow.

6.9 New DSA Compliance function

6.9.1 In order to meet the requirements of the DSA, Apple has established a DSA Compliance function within Apple's Compliance and Business Conduct Department.

6.9.2 On 24 August 2023, the ADI Board formally appointed the Head of DSA Compliance. The individual in question is an experienced compliance professional who has the required professional qualifications, knowledge and ability to fulfil the role. The individual has an in-depth knowledge of Apple's products and services and has for many years been responsible for internal and external risk management and risk mitigation strategies, including across the EU.

6.9.3 The DSA Compliance function is functionally independent from Apple's operational functions. The Head of DSA Compliance reports directly to the ADI Board on matters relating to DSA compliance.
6.9.4 Pursuant to Article 41(2) of the DSA, the Head of DSA Compliance has ultimate responsibility for, inter alia:

(a) cooperating with the Digital Services Coordinator to be designated by Ireland and with the Commission for the purposes of the DSA;

(b) ensuring that all risks referred to in Article 34 of the DSA are identified and properly reported on, and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35 of the DSA;

(c) organising and supervising the activities of the independent audit that ADI will procure in accordance with Article 37 of the DSA;

(d) informing and advising relevant Apple management and employees about relevant obligations under the DSA, including planned training on the DSA; and

(e) monitoring Apple's compliance with its obligations under the DSA.

6.9.5 The Head of DSA Compliance is supported in this role on a day-to-day basis by a number of legal and other functions responsible for work relating to the App Store, including the App Store Legal team, EU Regulatory Legal, and Privacy Compliance.

6.10 New DSA Information site

6.10.1 Apple has created a new DSA information site - https://www.apple.com/legal/dsa/ - which contains:

(a) the contact details of the Head of DSA Compliance, as the designated point of contact under Articles 11 and 12 of the DSA for communications with Member State authorities, the European Commission, the European Board for Digital Services, and developers and users of the App Store;

(b) a link to the new Content Reports portal;

(c) a link to the new Ads Repository;

(d) a link to the DSA redress page. This lists redress options for anyone who has filed an Article 16 notice via the Content Reports portal and who wants to challenge Apple's decision, as well as redress options for developers and users who want to challenge decisions Apple has taken. The page will be updated in the future as Article 21 out-of-court dispute settlement bodies are established; and

(e) a link to the average monthly recipients report.

6.10.2 Additional resources, for example transparency reports, will be added to the site in due course.

SECTION 7: REASONABLENESS, PROPORTIONALITY AND EFFECTIVENESS OF APP STORE RISK-MITIGATION MEASURES

7.1 Section overview

7.1.1 Pursuant to Article 35 of the DSA, Apple is required to implement "reasonable, proportionate and effective mitigation measures tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights".

7.1.2 This Section of the Report sets out why Apple considers that the existing risk mitigation measures detailed in this Report, as supplemented by the new risk mitigation measures Apple has implemented, or will be required to implement, in order to comply with the DSA, are reasonable, proportionate and effective to address the Systemic Risks described in Section 5 that could stem from the design, function or use of the App Store.

7.2 The App Store and its approach to risk mitigation

7.2.1 The App Store's risk mitigation measures have been developed with the benefit of the experience of inventing and establishing the wholly novel business model underlying the App Store, and the subsequent 15 years' experience of operating the App Store, dealing throughout that period with issues engaging or potentially engaging manifold risks, including the Systemic Risks, and developing and continuously improving the controls environment applicable to the App Store.
7.2.2 Apple notes that Recital 79 of the DSA states that "[VLOPs...] can be used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns." While Apple agrees that trust and safety are key considerations for the App Store, it is clearly not the case that the App Store is optimised to benefit an advertising-driven business model.

7.2.3 The success of the App Store has been built upon end users' trust that all apps available on the App Store respect the high standards of security, privacy, performance, user safety and product integrity to which Apple is committed. This benefits end users, who rely on the App Store as a trusted place where they can download apps that have been subject to both automated and human review. It also benefits developers, who rely on it as a way to connect to potential customers across the EU and around the world.

7.2.4 It is widely recognised that Apple effectively manages risks relating to the App Store. This success is demonstrated by the significant number of apps and developers which Apple keeps out of the App Store each year, compared to the relatively few occurrences of problematic apps being in the App Store and the swiftness with which any such examples are addressed.

7.2.5 The experience to date therefore points to Apple having struck a reasonable balance in maintaining a safe, predictable and trusted online environment, while at the same time recognising and effectively mitigating relevant risks, including protecting fundamental rights. Indeed, Apple's guiding principle for the App Store – to provide a safe and trusted place for customers to discover and download apps – is wholly aligned with the legislative purposes underpinning the DSA.

7.2.6 Apple is conscious that no compliance framework – nor any individual risk mitigation measure – operates with a 100% success rate. The hallmark of an effective compliance framework is that it earnestly and efficaciously addresses known risks, and evolves and adapts promptly to address new and emerging risks. That is undoubtedly the case with the App Store risk mitigation framework. As has been the case throughout the existence of the App Store, Apple will continue to keep the ongoing effectiveness of the App Store controls and risk mitigation measures under continuous review to address the evolving risk environment the App Store faces.

7.3 Reasonableness, proportionality and effectiveness of App Store risk-mitigation measures

7.3.1 None of the terms "reasonableness", "proportionality" or "effectiveness" is defined in the DSA; nor is there an equivalent regime to Articles 34 and 35 of the DSA to be found elsewhere in the acquis communautaire.
7.3.2 The risk mitigation measures in place and required in connection with the App Store can be considered on the basis of the ordinary, natural meaning of these words. Nonetheless, with a view to benchmarking those measures against comparable existing standards, Apple has considered the use of these words (or the use of analogous standards) in leading governmental guidance in jurisdictions outside the EU relating to the evaluation of corporate compliance structures, including the U.S. Department of Justice Guidance for Prosecutors, "Evaluation of Corporate Compliance Programs" (updated March 2023) (the "US DOJ Guidance"); the U.K. Ministry of Justice "Guidance about procedures which relevant commercial organisations can put into place to prevent persons associated with them from bribing" (section 9 of the Bribery Act 2010) (the "UK MoJ Guidance"); and the UK HM Revenue & Customs guidance of September 2017, "Tackling tax evasion: Government guidance for the corporate offences of failure to prevent the criminal facilitation of tax evasion" (the "UK HMRC Guidance"); and, within the EU, the French Anti-Corruption Agency Guidelines of January 2021 on the compliance arrangements relevant French companies need to establish in order to have "effective" compliance programs under the French anti-corruption law, the Loi Sapin II.

7.3.3 Although these guidance publications were developed in order to inform the evaluation of corporate risk-mitigation measures in criminal law (anti-corruption) contexts, they provide helpful indications as to the elements expected by leading enforcement authorities of an effective corporate compliance programme generally. Apple has drawn inspiration from these leading global standards in considering the reasonableness, proportionality and effectiveness of risk mitigation measures needed in respect of the App Store.

7.3.4 The UK MoJ Guidance and the UK HMRC Guidance adopt a practical approach, focussing on six general guiding principles which should inform such compliance programmes: risk assessment; proportionality of risk-based mitigation measures; top-level commitment within the company; due diligence; communication, including training; and monitoring and review. Apple has considered each of these elements in considering what is required to satisfy itself as to the reasonableness, proportionality and effectiveness of risk-mitigation measures in place in respect of the App Store.

7.3.5 The US DOJ Guidance takes a step back, and invites those assessing a corporate compliance programme to consider three questions:

(a) "Is the corporation's compliance program well designed?" Apple has considered this question when assessing the reasonableness and proportionality of the App Store risk mitigation measures detailed in Section 6.

(b) "Is the program being applied earnestly and in good faith? In other words, is the program adequately resourced and empowered to function effectively?" Apple has considered this question when assessing the proportionality and effectiveness of the App Store risk mitigation measures detailed in Section 6.

(c) "Does the corporation's compliance program work in practice?" Apple considered this question when assessing the effectiveness of the App Store risk mitigation measures detailed in Section 6. The DOJ Guidance notes that the question of effectiveness is a complex one, but recognises that no compliance program can be designed to address all breaches.
7.3.6 The French Anti-Corruption Agency's Guidelines take a more prescriptive approach, specifying necessary elements of an appropriate corporate anti-corruption compliance program, to include:

(a) a code of conduct;

(b) an internal whistleblowing mechanism;

(c) a corruption risk-mapping system;

(d) a third-party risk assessment process;

(e) internal and/or external accounting controls;

(f) training programs for employees exposed to higher risk;

(g) a disciplinary procedure for breaches by employees; and

(h) an audit mechanism.

7.3.7 As regards DSA compliance-related risk mitigation measures in respect of the App Store, elements (c), (d) and (h) from this list are an integral part of the mandatory requirements for VLOPs under Articles 34 and 37; (b) is reflected in the various notice and action mechanisms relevant to the App Store, including, inter alia, Report a Problem and Report a Concern; (f) is reflected in both the existing training provided to App Reviewers and content moderation specialists (to be supplemented with DSA-specific training provided to such personnel on the Systemic Risks); and (a), (e) and (g), while not relevant to Systemic Risk mitigation, find analogues in, respectively, the Guidelines; Apple's ongoing App Review of live apps and content moderation; and Apple's active enforcement of the Guidelines in the event of violation.

7.4 Reasonableness, proportionality and effectiveness of the risk mitigation measures designed to address the Systemic Risks identified in Section 5

7.4.1 Below, Apple addresses the reasonableness, proportionality and effectiveness of its risk mitigation measures that apply to the Systemic Risks identified in Section 5.

7.4.2 Apple notes at the outset that the scale and comprehensiveness of the risk mitigation measures applicable to the App Store strongly support the view that the risk mitigation measures are reasonable and proportionate. It is Apple's commercial imperative to keep the App Store a safe and trusted place, and it invests heavily in its risk mitigation measures to achieve this.

7.4.3 The DSA provides no meaningful indication as to the standard against which the effectiveness of risk mitigation measures is to be assessed, nor is there a ready analogue in other EU compliance obligations. Against this background, some inspiration may be drawn from the acquis communautaire, informed by the case law of the European Court of Human Rights, in relation to the right to an effective remedy for government violation of such rights under Article 47 of the Charter and Article 13 of the European Convention on Human Rights and Fundamental Freedoms. The settled jurisprudence on the right to an effective remedy focuses on there being a remedy capable of leading to the identification and resolution of a violation, and acknowledges that there is no requirement that the remedy go so far as to be capable of preventing the breach arising. In sum, the jurisprudence requires national courts to strike an appropriate, proportionality-based balance between the need to secure EU law rights in the national legal order and the application of domestic procedural and remedial rules.
In Apple's view, the risk mitigation measures it deploys strike the appropriate balance between the interests at stake in connection with the Systemic Risks and any other countervailing considerations, and effectively address the risks identified.

7.4.4 It bears repeating here that the following controls operate to address each of the Systemic Risks identified in Section 5, or, to the extent any of those risks conflict, to strike an effective balance between them.

7.4.5 First, both developers and users who engage with the App Store are subject to clear written terms, which are available online. Users' engagement with the App Store is governed by the AMS Terms, which provide a basis for Apple to take action against a user who does not comply. Developers' engagement with the App Store is governed by the ADA and DPLA, which are similarly readily enforceable against non-compliant developers. Both of these agreements clearly set out Apple's expectations with respect to security, privacy, performance, user safety and product integrity. Again, these documents provide Apple with a clear basis for taking action against developers who do not comply.

7.4.6 Second, all developers who want to publish apps on the App Store are subject to developer screening measures, both at onboarding and on an ongoing basis. The enrolment screening process helps Apple stop fraudulent or sanctioned developers, who may seek to develop and distribute apps containing illegal or harmful content, from gaining access to the App Store. Some malicious developers try to regain access to the App Store, and developer screening measures serve as an important gateway to keep them off or remove them from the App Store.

7.4.7 Third, the Guidelines set a clear and transparent standard for all apps and app updates that will be published on the App Store. The Guidelines are subject to periodic review, updates, and additions, which offer opportunities to enhance the Guidelines and address risk generally, including the Systemic Risks.

7.4.8 Fourth, all apps and app updates published on the App Store are subject to two levels of review. First, automated review gathers information that can be interpreted by machine learning algorithms and analysed for threats and signals (for example, the presence of malicious URLs or executable code) that provide relevant app information to the human review component. Second, all apps are subject to human review, where app reviewers analyse the signals provided by automated systems and review the features and functionality of apps to ensure they are compatible with the App Store's systems and products, comply with the Guidelines, and do not give signs of potential deceptive, abusive, or otherwise harmful behaviour.

7.4.9 A team of over 500 human app reviewers rigorously enforce the Guidelines. Their work is subject to ongoing monitoring and review. On a daily basis, the App Review team escalates app issues to senior management in the App Review team and the App Store Legal team. Certain issues are escalated to the ERB for consideration.

7.4.10 Fifth, even after apps are approved for publication on the App Store, they are subject to ongoing monitoring. Apple has a number of automated tools in place to detect malware on existing apps, which it runs at periodic intervals to capture content at different times. This includes tools to identify "bait-and-switch" apps, where apps available on the App Store change or add new functionality after approval by the App Review team.
7.4.11 Sixth, for published apps, the App Store provides avenues for consumers, developers, government authorities and others to provide notices and alerts of potential problems or concerns with apps or app content, and numerous teams within the App Store can and do act on these concerns. This includes the new DSA Content Reports portal.

7.4.12 As noted at paragraph 5.11.2 above, there is inevitably some measure of risk arising from the fact that the existing risk mitigation measures in place cannot be expected to have a 100% success rate in mitigating the Systemic Risks which may stem from the App Store, particularly as the nature of threats evolves. However, given that controls exist at different stages of the app lifecycle, these proactive and reactive steps ensure that threats to users who engage with the App Store are actively minimised.

(a) Article 34(1)(a) – Dissemination of Illegal Content

(i) Risk profile

7.4.13 As noted in Section 5, there is a material risk that, absent appropriate risk mitigation measures, the App Store could be used to disseminate illegal content to users in the EU. This includes the App Store being used to facilitate the infringement of intellectual property rights, and apps that facilitate fraud and other illegal behaviours or disseminate defamatory material.

7.4.14 With respect to users, Apple has concluded that the risk that App Store-hosted UGC may give rise to the dissemination of illegal content is low to moderate.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.15 The Submission Guidelines in the AMS Terms clearly prohibit the posting of objectionable, offensive, unlawful, deceptive, inaccurate or harmful content by users in ratings and reviews.

7.4.16 The DPLA also clearly prohibits developers from using the App Store to disseminate illegal content. The DPLA expressly provides that developers must not use the App Store to engage in unlawful or illegal activity, develop products which would commit an offence or facilitate the commission of a crime or civil wrong, threaten, incite or promote violence or terrorism, or other serious harm, create or distribute any content or activity that promotes CSAM, or that violates, misappropriates or infringes on the intellectual property rights of others.

7.4.17 Section 1.1 of the Guidelines prohibits objectionable content, including defamatory content.

7.4.18 Section 4.1 of the Guidelines prohibits apps which impersonate other apps or services.

7.4.19 Section 5 of the Guidelines states apps must comply with all legal requirements in any location where developers make them available, and specifies that the developer is responsible for understanding and ensuring their app conforms with all local laws.

7.4.20 In addition, Section 5 notes apps that solicit, promote or encourage criminal or clearly reckless behaviour are unacceptable, and warns that in extreme cases, such as apps that are found to facilitate human trafficking and/or the exploitation of children, the appropriate law enforcement authorities will be notified.

7.4.21 Section 5.2 requires developers to only include content in their app if they own it or are licensed or otherwise have permission to use it.

7.4.22 The Apple Search Ads terms and conditions also require developers to ensure that Apple Search Ads are legal in the country in which the ads will be presented to users.

(iii) App Review

7.4.23 Both automated review and human app review consider app submissions for illegal content.
With respect to automated review, this includes, for example, URL detection which analyses URLs that have been previously flagged for illegal or harmful content or characteristics. Post-publication, these automated systems also detect bait-and-switch tactics, which can facilitate illegal conduct. Human app reviewers also review each and every app submission and app update for potential legal issues and risks, including unlicensed content, CSAM, real money gaming, and terrorist content.

(iv) Additional specific controls

7.4.24 App Store fraud mitigation measures address the risk of the App Store being used to facilitate fraud. These measures include different forms of fraud detection and in 2022 prevented over USD two billion in fraudulent transactions.

7.4.25 The AMS Content Disputes process provides a mechanism for third parties to submit content disputes relating to the App Store via a dedicated webpage. Although the parties to the dispute are primarily responsible for its resolution, the AMS Content Disputes team can and does intervene, particularly in cases where the developer has been the subject of multiple complaints or where there is a reasonable indication that an IP violation has occurred.

7.4.26 For apps live on the store, the App Store provides avenues for consumers, developers, government authorities and others to provide notice of potential problems or concerns with apps or app content that may be illegal. This includes the new Content Reports portal. Escalation mechanisms exist to ensure that apps comply with the Guidelines and local law, and are removed where there are violations.

7.4.27 Notwithstanding that the risk that App Store-hosted UGC may give rise to the dissemination of illegal content is low to moderate, all ratings and reviews are subject to content moderation. This includes proactive measures, including automated scanning of all ratings and reviews, and reactive measures in circumstances where Apple is made aware of problematic ratings and reviews. In situations where ratings and reviews are escalated for further investigation, for example in cases where a reported concern relates to a rating and review that contains malicious activity that implies bodily harm, or child safety and / or child exploitation concerns, these are addressed, for example by Global Security Investigations, and may result in a report to law enforcement.

(v) Effectiveness

7.4.28 Terms and conditions prohibiting the dissemination of illegal content are vigorously and fairly enforced; they provide a basis for Apple to take fair and predictable action against developers and users who do not comply with the rules, including the removal of apps and termination from the App Store; and Apple does in fact take such action, extending not only to criminal content, but to a wide range of other illegal content.

7.4.29 Examples of apps recently removed or rejected from the App Store due to illegal content include an app with defamatory and antisemitic language in its metadata and an app that listed all Wi-Fi hotspots with offensive and homophobic titles (rejected under Guideline 1.1 for Objectionable content); an app impersonating the app of a verified developer (rejected under Guideline 4.1 for Copycat violations); and an app that used unlicensed song lyrics and also appeared to use a copycat user interface (rejected under Guideline 5.2 on Piracy).
7.4.30 Given the limited risk profile of Apple Search Ads, Apple considers its relevant terms and conditions and their enforcement are adequate to address any Systemic Risks engaged by Apple Search Ads.

(b) Article 34(1)(b) – Actual or foreseeable negative effects on rights to human dignity and respect for private and family life, enshrined in Articles 1 and 7 of the Charter

(i) Risk profile

7.4.31 As noted in Section 5, absent appropriate risk mitigation measures, the likelihood of developers seeking to publish apps capable of engaging the rights to human dignity and respect for private and family life in such a way as to give rise to Systemic Risks would be high, and the severity of such risks could vary from modest to extreme (for example, in the cases of CSAM, so-called "revenge pornography", or "deepfakes").

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.32 The Submission Guidelines in the AMS Terms and Conditions clearly prohibit the posting of objectionable, offensive, unlawful, deceptive, inaccurate or harmful content by users in ratings and reviews.

7.4.33 The DPLA also clearly prohibits developers from using the App Store to engage in unlawful or illegal activity; threaten or incite violence, terrorism, or other serious harm; or create or distribute any content or activity that promotes child sexual exploitation or abuse.

7.4.34 Section 1.1.1 of the Guidelines (Safety) prohibits apps that contain defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national / ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate or harm a targeted individual or group.

7.4.35 Section 1.1.2 prohibits realistic portrayals of people being killed, tortured or abused, or content that encourages violence.

7.4.36 Section 1.1.3 prohibits depictions that encourage violence, or illegal or reckless use of weapons.

7.4.37 Section 1.1.4 prohibits overtly sexual or pornographic material. This includes "hookup" apps and other apps that may include pornography or be used to facilitate prostitution, or human trafficking and exploitation.

7.4.38 Section 1.1.7 prohibits apps that contain harmful concepts which capitalise on current events.

7.4.39 Section 1.2 of the Guidelines requires apps with UGC to include methods for filtering objectionable content, mechanisms for reporting offensive content, the ability to block offensive users from the service, and published developer contact details. It also provides that apps with UGC or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (for example, "hot-or-not" voting), making physical threats, or bullying may be removed from the App Store without notice.

7.4.40 Section 1.4 of the Guidelines warns that apps that present risks of serious harm may be rejected.

7.4.41 Section 5 of the Guidelines (Legal) notes apps that solicit, promote, or encourage criminal or clearly reckless behaviour are unacceptable, and warns that in extreme cases, such as apps that are found to facilitate human trafficking and / or the exploitation of children, the appropriate authorities will be notified.
(iii) App Review

7.4.42 Both automated review and human app review consider app submissions that may engage these rights, although given their nature, app submissions that have actual or foreseeable negative effects on these rights are more likely to be addressed via human review.

(iv) Additional specific controls

7.4.43 Where the App Store is alerted to risks of CSAM being disseminated on apps that are available on the App Store, the relevant teams escalate issues to Child Safety Counsel (see paragraphs 6.8.7 to 6.8.8 above). All such escalations are investigated and, if appropriate, notified to law enforcement.

7.4.44 To the extent that suspected criminal offences involving threats to the life or safety of a person or persons, as envisaged by Article 18 of the DSA, engage the right to human dignity, the new Article 18 procedures applicable to the App Store are designed to ensure that law enforcement authorities in the Member States concerned are notified in a timely manner.

7.4.45 Where new or novel issues involving human dignity are identified, they are escalated to App Review Policy and other teams for consideration.

(v) Effectiveness

7.4.46 The App Store considers and takes action against apps that give rise to actual or foreseeable negative effects on rights to human dignity and respect for private and family life. For example, Guideline 5 (Legal) has recently been used to consider apps giving rise to risk of use for the purposes of modern slavery, including child labour, and human trafficking; to remove video call and chatroom apps identified as carrying CSAM content; and to address apps incorporating social media features which are identified as being used for bullying, threats and other abuse.

(c) Actual or foreseeable negative effects on developers' and users' rights to the protection of personal data enshrined in Article 8 of the Charter

(i) Risk profile

7.4.47 As noted in Section 5, absent risk mitigation measures, there would be a significant risk that there could be negative effects on developers' and users' rights to the protection of their personal data.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.48 Section 5 of the Guidelines (Legal) makes clear that protecting user privacy is paramount in the Apple ecosystem. It also prohibits developers from using, transmitting or sharing a user's personal data without first obtaining their permission, and requires developers to provide access to information about where and how a user's personal data will be used. Explicit permission must be obtained from the user in order to track their activity, via the App Tracking Transparency API.

7.4.49 The DPLA also requires developers and their apps to comply with all applicable privacy and data collection laws and regulations with respect to any collection, use or disclosure of user or device data (e.g. a user's IP address, the name of the user's device, and any installed apps associated with a user).

7.4.50 The Submissions Guidelines in the AMS Terms also clearly prohibit the posting of personal, private or confidential information belonging to others, or requesting personal information from a minor.

7.4.51 Section 5 of the Guidelines (Legal) warns that apps which share user data without user consent or which otherwise do not comply with data privacy laws may be removed from sale, and may also result in the developer's removal from the Apple Developer Program.
7.4.52 In addition, the App Store & Privacy Notice ensures that users have an effective choice and any consent to data use on Apple products is fully informed. Apple's Data & Privacy Icon also provides users with transparent and easily accessible information that details how Apple collects, processes and discloses their personal data.

(iii) App Review

7.4.53 Both automated review and human review consider app submissions for privacy protections and compliance with Apple's privacy requirements. For example, automated review involves checks [CONFIDENTIAL]. Human reviewers then consider whether [CONFIDENTIAL], including permission requests seeking the user's permission for such access, are consistent with the purported functionality and purpose of the app. They also ensure that developers have complied with all privacy-related Guidelines requirements, including requirements to publish privacy policies.

(iv) Additional specific controls

7.4.54 Product pages in the App Store feature a section that includes summaries prepared by developers of their key privacy practices in a simple, easy-to-read label, which informs the user about the app's privacy practices before downloading it. These labels show how developers are collecting and using users' data, such as a user's location, browsing history, and contacts.

7.4.55 App Privacy Reports enable users to see how often their location, photos, camera, microphone, and contacts have been accessed by apps during the last seven days, and which domains those apps have contacted. Users therefore have full and easy visibility into the ways apps use the privacy permissions a user has granted them, as well as their respective network activity.

7.4.56 Developers who want to track a user across apps and websites or access their device's data for advertising purposes must seek the user's permission through the App Tracking Transparency Framework. This applies across all apps available on the App Store, including Apple's own apps.

7.4.57 Apps may request access to features such as a user's location, contacts, calendars, or photos. The App Sandbox protects user data by limiting access to resources requested through entitlements. Users receive a prompt with an explanation the first time an app wants to use this data, allowing them to make an informed decision about granting permission.
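By way of illustration, the following minimal Swift sketch shows the permission flow that paragraphs 7.4.56 and 7.4.57 describe. The function name is hypothetical; the framework call and authorisation statuses are those published by Apple in the AppTrackingTransparency framework.

```swift
import AppTrackingTransparency

// Illustrative sketch only; the function name is invented for this example.
// Before this runs, the app must declare an NSUserTrackingUsageDescription
// purpose string in its Info.plist; the system shows that explanation to the
// user in the one-time permission prompt.
func requestTrackingPermissionIfNeeded() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user granted permission: tracking across apps and
            // websites owned by other companies is permitted.
            break
        case .denied, .restricted, .notDetermined:
            // No consent: the app must not track the user.
            break
        @unknown default:
            break
        }
    }
}
```

Access to the other protected resources noted in paragraph 7.4.57 (location, contacts, calendars, photos) follows the same general pattern: a purpose string declared by the developer and a one-time system prompt the first time the app requests access.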
7.4.58 Users are able to determine whether they receive personalised recommendations when they are discovering apps on the App Store. If Personalised Recommendations is turned off, a user will not receive personalised recommendations or editorial content. Instead, recommendations from the app repository will display apps without reference to the user's engagement with the App Store.

(v) Effectiveness

7.4.59 With specific respect to the right to protection of personal data, in addition to the measures above, the effectiveness of Apple's risk mitigation measures is ensured firstly by Apple's ongoing compliance with GDPR, and secondly by putting users firmly in control of the management of their own data when using the App Store. In accordance with Article 24 of the GDPR, the measures implemented by Apple take account of the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons. These measures are subject to continuous review.

(d) Actual or foreseeable negative effects on the rights of developers and users to freedom of expression and freedom of information, including the freedom and pluralism of the media, under Article 11 of the Charter

(i) Risk profile

7.4.60 As noted in Section 5 above, while such risks to freedom of expression and information may conceivably arise in connection with the App Store, the probability of negative effects on these rights arising in practice can only reasonably be seen as remote; and their impact, should they arise, modest.

7.4.61 As regards the freedom and pluralism of the media, as noted at paragraph 5.7.13, notwithstanding the risk of abusive governmental takedown demands, the risk of negative impacts on pluralism of the media in the EU stemming from the App Store is, on any objective analysis, low.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.62 The Introduction to the Guidelines states clearly that Apple strongly supports all points of view being represented on the App Store, as long as the apps are respectful to users with differing opinions and the quality of the app experience is high.

7.4.63 Apple notes that such is its commitment to pluralism of the media that it uniquely and exceptionally exempts professional political satirists and humourists from its prohibition in Guideline 1.1.1 on defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national / ethnic origin, or other targeted groups.

7.4.64 The AMS Terms permit users to post reviews of apps they have downloaded, provided they comply with the Submissions Guidelines, such restrictions being designed to keep the App Store a safe and trusted place for all.

(iii) App Review

7.4.65 All apps will be admitted to the App Store unless they are illegal or in violation of the DPLA or Guidelines, which are publicly available. Where app submissions raise novel human rights issues, including issues that engage freedom of expression, they can be escalated as appropriate to the various teams that support App Review, including App Review Policy, the App Store Legal team, and if necessary the ERB. When apps are rejected, developers have recourse to challenge rejection decisions via the appeals process.

(iv) Additional specific controls

7.4.66 When Apple receives government takedown requests targeted at media apps or journalistic content, they are addressed in accordance with the escalation procedures detailed in Section 6 above. The App Store Legal team and other functions assess whether the app complies with the Guidelines, and whether the request is in accordance with local law (both as to substance as well as whether the agency has the authority to make the request). App Store Legal will in some instances consult with local counsel on the legality of the request. The App Store Legal team can also escalate requests to the ERB for consideration. If a request is in accordance with local law, the media app may be removed from a local App Store Storefront. Requests that are not in accordance with local law would only be actioned if the app otherwise violated the Guidelines.
(v) Effectiveness

7.4.67 A broad range of views and opinions from across the EU are available on the App Store. The App Store risk mitigation measures balance the tension between freedom of expression and the need to keep users safe.

7.4.68 A very broad range of media voices across the EU are present on the App Store. Consideration of the issue of media pluralism by the UK's specialist communications and media regulator, Ofcom, has not identified concerns for media pluralism stemming from the App Store. Apple is not aware of material concerns being raised in any other quarter with respect to negative effects in the EU for media pluralism stemming from the App Store. In those circumstances, Apple has no reason to believe that its risk mitigation measures are anything other than effective with respect to mitigating the risk of actual or foreseeable negative effects for the exercise of freedom of expression and information, and for media pluralism.

(e) The right to non-discrimination under Article 21 of the Charter

(i) Risk profile

7.4.69 As noted in Section 5 above, Apple does not discriminate against developers or users, including when conducting developer screening, App Review, or responding to notices and actions (including from law enforcement). As regards app recommendations and Apple Search Ads, if a user has personalisation turned on, age, gender and location are used to present personalised content, but such conduct does not amount to discrimination.

7.4.70 As regards developer use, although discriminatory content is clearly prohibited under the Guidelines, there is a risk that users could be exposed to such content in the App Store if it were not identified during the App Review process. However, app reviewers are trained to identify such content, and the notices and actions and complaints mechanisms provide means to raise relevant concerns regarding apps that are already published on the App Store.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.71 Section 1.1.1 of the Guidelines (Safety) prohibits apps that contain defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national / ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate or harm a targeted individual or group.

7.4.72 The Developer Code of Conduct prohibits developers from engaging in discriminatory practices, and notes that repeated manipulative or misleading behaviour will lead to their removal from the Apple Developer Program.

(iii) App Review

7.4.73 As part of the App Review process, human app reviewers ensure that app metadata, including text and images that will appear on the App Store, complies with the Guidelines, including the relevant provisions listed above.

(iv) Effectiveness

7.4.74 Apple is not aware of any concerns from developers or users that Apple discriminates against them when attempting to gain access to the Apple Developer Program.
7.4.75 As regards App Store content, App Review scrutinises app metadata when submissions are made to the App Store, and any content that is discriminatory and therefore not in compliance with the Guidelines will not be admitted to the App Store. Examples of apps rejected or removed for violating the Guideline prohibition on discriminatory content include an app that had defamatory and antisemitic content in the app metadata, an app that included racist, homophobic and other derogatory posts, and an app that referred to certain groups as Nazis.

(f) Actual or foreseeable negative effects on the rights of the child enshrined in Article 24 of the Charter (addressing also the risk of negative effects in relation to the protection of minors, under Article 34(1)(d))

(i) Risk profile

7.4.76 As noted in Section 5, absent appropriate risk mitigation measures, the App Store could give rise to, or be used in a manner giving rise to, risks of actual or foreseeable negative effects for the exercise of the rights of the child under Article 24 of the Charter.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.77 The AMS Terms set out the requirements for "Family Sharing" accounts. The "family organizer" must be at least 18 years old (or the equivalent age of majority in their country or territory of residence) and must be the parent or legal guardian of any users under age 13 (or the equivalent age in their country or territory of residence). The terms also explain how purchase sharing works, and the ways in which eligible content is shared among members of a family, including the "Ask to Buy" feature.

7.4.78 The Submissions Guidelines in the AMS Terms prohibit various forms of misuse, including using the App Store to request personal information from a minor.

7.4.79 Section 2.4 of the Schedules to the DPLA provides that the developer is responsible for determining and implementing any age ratings or parental advisory warnings required by the applicable government regulations, ratings board(s), service(s), or other organisations for any content offered in their app. These age rating determinations are considered during App Review.

7.4.80 The introductory section to the App Review Guidelines reminds developers: "We have lots of kids downloading lots of apps. Parental controls work great to protect kids, but you have to do your part too. So know that we're keeping an eye out for the kids."

7.4.81 Section 1.3 (Kids category) provides that apps in the "Kids" category must not include links out of the app, purchasing opportunities, or other distractions to kids unless reserved for a designated area behind a "parental gate". 68 In addition to complying with privacy laws applicable to children, Kids Category apps may not send personally identifiable information or device information to third parties and should not include third-party analytics or third-party advertising.

Footnote 68: A parental gate presents an adult-level task that must be completed in order to continue (see the illustrative sketch below). The App Store provides developers with guidance regarding the creation of parental gates here: https://developer.apple.com/app-store/kids-apps/

7.4.82 Section 2.3.8 requires all app metadata, including apps and in-app purchase icons, screenshots, and previews, to adhere to a 4+ age rating, even if the app is rated higher. By way of example, even if a developer's game includes violence, images on the App Store should not depict a gruesome death or a gun pointed at a specific character.
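Footnote 68 describes a parental gate in general terms. Purely as a hypothetical illustration of that pattern, and not Apple's prescribed implementation, the following SwiftUI sketch gates a restricted area behind an adult-level arithmetic task; all type and property names are invented for this example.

```swift
import SwiftUI

// Hypothetical illustration of a simple "parental gate": the app poses an
// adult-level task (here, a multiplication problem) that must be solved
// before continuing to a restricted area, such as an external link.
struct ParentalGateView: View {
    let onPassed: () -> Void          // called once the gate is passed

    @State private var lhs = Int.random(in: 12...19)
    @State private var rhs = Int.random(in: 3...9)
    @State private var answer = ""
    @State private var failed = false

    var body: some View {
        VStack(spacing: 16) {
            Text("Ask a grown-up: what is \(lhs) × \(rhs)?")
            TextField("Answer", text: $answer)
                .keyboardType(.numberPad)
                .textFieldStyle(.roundedBorder)
            Button("Continue") {
                if Int(answer) == lhs * rhs {
                    onPassed()        // gate passed; reveal restricted area
                } else {
                    failed = true
                    lhs = Int.random(in: 12...19)  // new task after a failure
                    rhs = Int.random(in: 3...9)
                    answer = ""
                }
            }
            if failed {
                Text("That is not right. Please try again.")
            }
        }
        .padding()
    }
}
```

A host view would present ParentalGateView and reveal the gated content (for example, an outbound link) only once onPassed fires.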
7.4.83 Section 5.1.4 addresses personal privacy and data requirements for children: apps must comply with all applicable children's data protection laws (for example, the GDPR); apps should not include third-party analytics / advertising if intended for kids; use of terms like "For Kids" and "For Children" is reserved for the Kids Category; and apps not in the Kids Category cannot imply the app is for children.

(iii) App Review

7.4.84 As part of the App Review process, human app reviewers assess whether apps align with the age ratings guidelines and, where a developer has submitted an app proposed for the Kids Category, whether the app meets the Kids Category guidelines.

(iv) Additional specific controls

7.4.85 All privacy-related controls listed above apply to minors.

7.4.86 Where the App Store is alerted to risks of CSAM being disseminated on apps that are available on the App Store, the relevant teams escalate the issues [CONFIDENTIAL] (see paragraphs 6.8.7 to 6.8.8 above). All such escalations are investigated and, if appropriate, notified to law enforcement.

7.4.87 Apple Search Ads is built with strong limitations to protect children and minors. For example, for a minor under 18 (or the age of majority in the relevant jurisdiction) who is logged in with their Apple ID account, the Personalised Ads setting is automatically set to "off" and cannot be enabled until the user reaches the age of majority. Furthermore, age ratings and the age of the user determine whether or which Apple Search Ads will be displayed to users under 18 years of age; Apple Search Ads are not presented to users under the age of 13; and apps rated 17+ are not presented as Apple Search Ads to users under 18.

(v) Effectiveness

7.4.88 The App Store is not a service that is directed at or predominantly used by minors. However, Apple recognises that minors access apps available on the App Store and maintains controls to ensure that they are protected. Apple has created device-level controls, such as Screen Time, to give parents control over apps that their children can download and use on their devices.

7.4.89 Even if parents choose not to use Screen Time and related controls, all apps on the App Store have already been subject to both automated and human review, and App Store content is subject to the 4+ age rating requirement.

7.4.90 A very significant number of apps are rejected after App Review due to concerns relating to minors. This includes, for example, dating apps targeted at minors, apps intended for children with educational and quiz type features that allow users to communicate without a "parental gate" control, apps that fail to comply with applicable privacy laws for minors, apps with public chat room access, and apps intended to connect users which require them to state their age, body type and gender preferences.
7.4.91 Apple's verification system for Apple IDs created for children is appropriate, when viewed in conjunction with its comprehensive privacy controls for all users and its additional safeguards for children (including Apple IDs for children, Family Sharing, App Store safeguards and requirements, and Screen Time use and content restrictions). This is particularly so given that the App Store is not a social media service, a service that seeks or offers validation, or one which uses children's data to create extensive profiles for advertising purposes. Apple does not collect unnecessary data that would determine how old a user is, but offers numerous other protections that apply to children.

7.4.92 Apple will continue to monitor the EU BIK+ strategy, including the ongoing work relating to an EU code of conduct on age-appropriate design.

(g) High level of consumer protection, enshrined in Article 38 of the Charter

7.4.93 As noted in Section 5 above, the protection of consumers is a foundational principle of the App Store. In Apple's assessment, the collective effect of the risk mitigation measures detailed in Section 6 is to ensure a high level of consumer protection for end users when they engage with the App Store, which is both reasonable and proportionate in light of the level of Systemic Risks which may stem from the design, function or use of the App Store.

(h) Actual or foreseeable negative effects on electoral processes

(i) Risk profile

7.4.94 While online platforms can be used to disseminate false information which may give rise to risk relating to electoral processes, the likelihood of the App Store being used for such purposes is very substantially lower than for online platforms focussing primarily on UGC. Indeed, Apple considers the risk in this respect to be low in absolute terms.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.95 The AMS Terms prohibit manipulating play counts, downloads, ratings, or reviews via any means, such as (i) using a bot, script, or automated process, or (ii) providing or accepting any kind of compensation or incentive.

7.4.96 The Introduction to the Guidelines states clearly that Apple strongly supports all points of view being represented on the App Store, as long as the apps are respectful to users with differing opinions and the quality of the app experience is high. Any app including content or behaviour which violates Apple's policies or terms will be rejected.

7.4.97 Additional Guidelines requirements detailed above that relate to illegal content and human dignity are also relevant here.

(iii) App Review

7.4.98 The App Review teams are vigilant to the issues surrounding electoral processes, and work to exclude apps which are expected to be used to propagate harmful, misleading or deceptive information in connection with such processes, or apps that present themselves as official campaign apps, poll worker apps, or election resource apps.

(iv) Additional specific controls

7.4.99 During an electoral cycle in any given country, the App Review team maintains particular vigilance with a view to ensuring that apps giving rise to concerns are appropriately escalated. Relevant determinations are passed downstream to recommender systems and editorial teams to ensure that only relevant and legitimate apps relating to electoral processes are surfaced for users in stories or in recommendations.
7.4.100 Where events in a particular country or in connection with a particular event or situation give rise to specific concerns regarding potential disinformation or attempts to interfere with electoral processes, various App Store support functions, such as App Review Policy, the App Store Legal team, or the ERB, coordinate in order to ensure that new and emerging issues can be addressed. This may result in updated guidance to App Store support teams, including the App Review team and local editorial teams.

7.4.101 The country teams responsible for any particular App Store storefront are highly attuned to political trends and events in their countries of responsibility, and factor considerations relevant to electoral processes into editorial decisions.

(v) Effectiveness

7.4.102 Apple considers that, bearing in mind its low risk profile in this respect, the App Store risk mitigation measures are reasonable and proportionate, and are capable of dealing effectively with any risks which may arise in connection with electoral processes.

(i) Actual or foreseeable negative effects on civic discourse and public security (including disinformation)

(i) Risk profile

7.4.103 The App Store does not give rise to the risk of negative effects on civic discourse and public security to an extent remotely comparable with those online platforms whose design, function and / or use involve the widespread dissemination and rapid amplification of content, including UGC or news.

7.4.104 The risk that user ratings or reviews of apps hosted on the App Store may negatively affect civic discourse, electoral processes, or public security is low.

7.4.105 Apple notes in this respect the balance to be struck between protection of civic discourse against disinformation (particularly where such disinformation may give rise to material harmful effects to the public) and the protection of freedom of expression and information, including media pluralism.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.106 To the extent that public security considerations are taken to extend to risk mitigation measures to identify and address illegal content or illegal conduct, these are addressed in the terms and conditions and applicable Guideline provisions listed above in respect of illegal content.

7.4.107 Those Guidelines provisions listed above in respect of the rights to human dignity and respect for private and family life, and freedom of expression, are also relevant to negative effects on civic discourse and public security.

(iii) App Review

7.4.108 The App Review process, including its ongoing review of live apps, includes controls designed to identify apps intended to have an adverse impact on civic discourse, for example those apps designed to disseminate extremist content or disinformation. In practice, Apple enforces its applicable terms and conditions in relation to matters capable of adversely affecting civic discourse, such as inclusion of illegal content, pandemic disinformation, or terrorist content.

(iv) Additional specific controls

7.4.109 The additional specific controls listed above at paragraph 7.4.99 et seq. in respect of negative effects on electoral processes apply also in the case of negative effects on civic discourse and public security.

(v) Effectiveness

7.4.110 Apple considers that the provisions referred to above provide it with ample basis to take action against threats to public security or civic discourse which may arise in connection with the App Store.
(j) Actual or foreseeable negative effects in relation to gender-based violence

(i) Risk profile

7.4.111 As stated in Section 5 above, the risk of the App Store being used to disseminate apps with potential adverse effects in relation to gender-based violence, the probability of such risks crystallising, and the potential impacts that may flow therefrom are similar to the risks described above with respect to illegal content.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.112 The terms and conditions and applicable Guideline provisions listed above in respect of illegal content and the right to human dignity address negative effects in relation to gender-based violence. Notably:

(a) Section 1.1.1 of the Guidelines clearly prohibits content on the App Store that is defamatory, discriminatory or mean-spirited, including references or commentary about religion, race, sexual orientation, gender, national / ethnic origin, or other targeted groups. This is particularly the case if the app is likely to humiliate, intimidate or harm a targeted individual or group.

(b) Section 1.1.2 of the Guidelines prohibits realistic portrayals of people or animals being killed, maimed, tortured or abused, or content that encourages violence.

The App Store's relevant controls include these and other clear Guideline prohibitions, the removal of apps identified as giving rise to such risks, and the ability to alert law enforcement authorities.

(iii) App Review

7.4.113 Apple refers to the App Review practices identified above with respect to: (1) the dissemination of illegal content; and (2) the rights to human dignity, and to private and family life.

(iv) Additional specific controls

7.4.114 Apple refers to the specific controls identified above with respect to: (1) the dissemination of illegal content; and (2) the rights to human dignity, and to private and family life.

(v) Effectiveness

7.4.115 Apple considers that its assessments at paragraphs 7.4.27 et seq. and 7.4.43 et seq. above as to the effectiveness of its risk mitigation measures relating to, respectively, dissemination of illegal content and the rights to human dignity, apply equally in respect of the risk of actual or foreseeable negative effects in relation to gender-based violence stemming from the design, function or use of the App Store.

(k) Actual or foreseeable negative effects on public health, and serious negative consequences to a person's physical and mental well-being

(i) Risk profile

7.4.116 As noted at section 5.9(e) above, risks to public and individual health do not arise from the use of the App Store in a manner or to an extent comparable with other online platforms with business models focussing on widespread dissemination and rapid amplification of UGC. In general, Apple considers the risk profile of the App Store in this respect to be, objectively, no more than modest, while nonetheless acknowledging that were such risks to crystallise, their impact could be significant.

7.4.117 In light of the UGC content moderation controls, the risk that user ratings or reviews of apps hosted on the App Store may produce negative effects on public health and physical and mental well-being is low. Apple has considered the heightened vulnerabilities of young users with regard to risks to individual health and well-being; it provides a number of controls and a support structure (for example, parental controls) which specifically address these risks.
Given the likely impact and prevalence of such risks, those controls are set to "on" by default or are readily available to parents to facilitate the safety of children.

7.4.118 As regards UGC, the risk that user ratings or reviews of apps hosted on the App Store may produce negative effects on public health and physical and mental well-being is clearly low.

(ii) Terms and Conditions and Applicable App Review Guidelines

7.4.119 The AMS Terms prohibit users from posting objectionable, offensive, unlawful, deceptive, inaccurate, or harmful content in ratings and reviews.

7.4.120 The Guidelines contain multiple rules that address physical health and well-being. For example:

(a) Section 1.4 of the Guidelines addresses app behaviour that risks physical harm.

(b) Section 1.4.1 specifically addresses "medical apps".

(c) Section 1.4.2 specifically addresses "drug dosage calculators".

(d) Section 1.4.3 specifically addresses apps that "encourage consumption of tobacco and vape products, illegal drugs or excessive amounts of alcohol".

(e) Section 1.4.5 provides that apps should not urge customers to participate in activities (like bets, challenges, etc.) or use their devices in a way that risks physical harm to themselves or others.

(iii) App Review

7.4.121 App Review seeks to ensure that all app submissions comply with the Guidelines above.

(iv) Additional specific controls

7.4.122 Engagement with the App Store does not give rise to addiction issues that have the potential to cause serious negative consequences to a person's physical and mental well-being. To the extent that such risks arise outside of the App Store after users download apps, Apple's Screen Time functionality, referred to in Section 3, can be used by adults and by vulnerable and minor users to track and control the time they are spending on particular apps.

7.4.123 Apple notes in passing that its requirement in the Guidelines (Guideline 1.2) for apps with user-generated content or social networking services to include arrangements for filtering objectionable material, reporting offensive content, and blocking abusive users provides helpful mitigation in respect of risks in this category arising from third-party apps.

7.4.124 All ratings and reviews are subject to controls to ensure that they comply with the Submissions Guidelines, including to ensure that they do not contain objectionable, offensive, unlawful, deceptive, inaccurate, or harmful content.

(v) Effectiveness

7.4.125 Apple considers its risk mitigation measures to provide it with sufficient means to take action against threats to public or individual health which may arise in connection with the App Store.