Non-Confidential Version

APPLE DISTRIBUTION INTERNATIONAL LIMITED

App Store – Second Report on Risk Assessment and Risk Mitigation Measures pursuant to Articles 33, 34 and 35 of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act)

27 August 2024

App Store – Report on Risk Assessment and Risk Mitigation Measures

OVERVIEW

This Risk Assessment Report is structured as follows:

Section 1 contains an explanation of Apple's approach to its second DSA App Store risk assessment and this report.

Section 2 describes the risk profile of the App Store, with reference to its key attributes and functionalities. It also describes certain attributes and functionalities that exist on other commonly used online platforms, including VLOPs, that do not exist on the App Store. This is intended to inform the assessment of the Systemic Risks and their risk mitigation measures, which is detailed in Section 3.

Section 3 contains the results of Apple's assessment of how the Systemic Risks in the EU may stem from the design, functionality or use of the App Store, as well as Apple's identification and assessment of its risk mitigation measures that address those risks.

Annex 1 (separate enclosure) contains background detail on App Store features, and relevant policies, procedures and controls.

***

This Report was prepared by Apple Distribution International Ltd. ("ADI") solely for transmission to the European Commission and the Digital Services Coordinator for Ireland, Coimisiún na Meán, pursuant to Article 42(4)(a) and (b) of the DSA. The Report is confidential and contains commercially sensitive information. It cannot be disclosed under Regulation 1049/2001 as this would undermine Apple's commercial interests, including its intellectual property. For the sake of completeness, Apple intends to publish a non-confidential version of the Report, in accordance with Article 42(4), following receipt of the second DSA audit report pursuant to Article 37(4).

SECTION 1: INTRODUCTION AND BACKGROUND
SECTION 2: APP STORE RISK PROFILE
SECTION 3: ASSESSMENT OF SYSTEMIC RISKS AND RISK MITIGATION MEASURES

SECTION 1: INTRODUCTION AND BACKGROUND

Section overview

This section of the report contains an explanation of Apple's[1] approach to its second App Store risk assessment and this report.

2024 Report and its structure

This report contains the results of the second Apple App Store risk assessment, which has been conducted in accordance with Article 34 of the DSA, as well as Apple's assessment of its App Store risk mitigation measures pursuant to Article 35 of the DSA. The report builds on the August 2023 App Store Risk Assessment (the "First App Store Risk Assessment"), which was submitted to the European Commission on 30 August 2023, and subsequently to Coimisiún na Meán. Unless stated otherwise, definitions from the First App Store Risk Assessment are adopted in this report.
This report relates to Apple's provision of the App Store service[2] in the EU, which the Commission designated in April 2023 as a single VLOP.[3]

Sections 1 and 2 of the First App Store Risk Assessment include background information on the App Store VLOP designation and Apple's risk assessment methodology. That detail is not repeated in this report, save where Apple has looked at different information sources in 2024 to inform its risk assessment work.

Section 3 of the First App Store Risk Assessment details certain relevant Apple-level (i.e. non-App Store specific) functions, policies and practices that apply to all of Apple's products and services across the wider Apple ecosystem. These protections apply to the use of all Apple devices, regardless of whether a user engages with the App Store, and, while not forming part of the design or function of the App Store itself or of Apple's provision of the App Store, they contribute to the overall risk environment in which the App Store operates. These protections extend to, but are not limited to, Apple's provision of the App Store. These controls are not detailed in the second App Store risk assessment, save to the extent that they materially mitigate any Systemic Risks that stem from the design, function or use of the App Store, and are referred to in Section 3.

Sections 4 and 6 of the First App Store Risk Assessment contain detailed background information on the operation of the App Store and the risk mitigation measures that Apple has developed since it launched the App Store over 15 years ago. That detail was important to provide the background relevant to Apple's Article 34 assessment of the Systemic Risks and Apple's Article 35 risk mitigation measures assessment, particularly for the purposes of its first report.

[1] Although ADI is responsible for the provision of the App Store in the EU, and for determining the purposes and means of processing personal data in the context of this provision, and considering that ADI personnel contribute to the policies, processes and procedures relevant to the provision of the App Store in the EU and globally, for the purposes of this report, and unless otherwise stated, we do not distinguish between ADI and Apple Inc. Instead, we refer to "Apple" policies, processes and procedures, without prejudice to which entity is providing the actual service or product being discussed.
[2] At the time of publication of the First App Store Risk Assessment, Apple operated five separate App Stores in the EU (iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store, and tvOS App Store). Since then, Apple has launched a separate visionOS App Store for the Apple Vision Pro device in France and Germany.
[3] ADI considers these services to be separate online platforms, which have significant material differences from both a developer and end user perspective. ADI considers that only the iOS App Store should have been designated as a VLOP. Nonetheless, in the light of the definition of App Store in the Commission's decision, ADI has prepared this Report on the basis that it extends to the iOS App Store, iPadOS App Store, watchOS App Store, macOS App Store, tvOS App Store, and the visionOS App Store. We refer to the "App Store" as referring to all of those services.
Rather than repeat that content this year, Apple has consolidated this information into Annex 1, which has been updated to reflect any material changes in the operation of the App Store and its risk mitigation measures since 28 August 2023.

Sections 5 and 7 of the First App Store Risk Assessment contain, respectively, Apple's assessment of the Systemic Risks and its assessment of the risk mitigation measures that address those risks. In this 2024 report, those sections are consolidated into a new Section 3, with some information presented in a table format.

The First App Store Risk Assessment was drafted as at 27 August 2023. Apple now has the benefit of data derived from additional controls and processes that it implemented to address its DSA obligations, as well as additional reference points to take into account, such as the European Commission's Article 35 guidelines on the mitigation of systemic risks for electoral processes. Such information has been factored into the second App Store risk assessment and risk mitigation assessment, which covers the period 28 August 2023 to 27 August 2024.

Ongoing DSA engagement

In addition to its ongoing management of risks on the App Store, since submitting the First App Store Risk Assessment, Apple has actively participated in the following DSA-related engagements:

1. Met with the European Commission to discuss the First App Store Risk Assessment and its ongoing DSA compliance efforts;
2. Received one RFI from the European Commission regarding the First App Store Risk Assessment ("ECRFI1"),[4] which it responded to in January 2024. Where relevant, information submitted to the Commission in its response has been incorporated into its ongoing risk assessment work;
3. Received an RFI from the European Commission regarding Article 40(12) of the DSA (researcher access to publicly accessible data),[5] which it responded to in February 2024;
4. Monitored and considered requests for information issued to other VLOPs;
5. Provided the First App Store Risk Assessment to, and discussed it with, Coimisiún na Meán;
6. Engaged with the European Commission and other stakeholders in connection with the Commission's Article 35 guidelines on electoral processes, which included attending an in-person "stress test" event, the provision of information to the European Commission on elections readiness, and attending three additional online meetings in the lead-up to and following the European Parliamentary elections;
7. Engaged with the European Commission on the protection of minors;
8. Engaged in European Commission consultations regarding various aspects of the DSA, including the draft transparency report delegated act and related templates;[6]
9. Engaged via trade associations regarding the European Commission's Article 28 guidelines consultation; and
10. Attended the June 2024 DSA Risk Assessment stakeholder event, organised by the Global Network Initiative and Digital Trust and Safety Partnership.

Feedback and additional insights on the conduct of DSA Risk Assessments from these engagements have been factored into the second risk assessment exercise.

[4] https://digital-strategy.ec.europa.eu/en/news/commission-sends-request-information-apple-and-google-under-digital-services-act
[5] An RFI was sent to 17 VLOPs, including Apple. https://digital-strategy.ec.europa.eu/en/news/commission-sends-requests-information-17-very-large-online-platforms-and-search-engines-under
DSA processes, systems and controls

As detailed in the First App Store Risk Assessment, Apple established a number of new processes, systems and controls in connection with its DSA obligations. These include:

1. Establishment of a dedicated DSA Compliance function. In the last 12 months the DSA Compliance function has:
   a. continued to develop DSA risk management and escalation processes, including processes to provide regular updates to the ADI Board, in conjunction with App Store Legal;
   b. developed a DSA training program for business functions that have a role in mitigating risks on the App Store; and
   c. engaged an audit firm in connection with, and organised and supervised activities relating to, the Year 1 DSA Audit;[7]
2. Creation of an enhanced illegal content reports portal (the "Content Reports Portal")[8] and related systems to track and monitor notices and responses;
3. Establishment of transparency reporting processes. Apple has published two DSA App Store Transparency reports since the First App Store Risk Assessment was finalised;[9]
4. Creation of a new DSA Legal webpage,[10] with various data points including points of contact information, a link to the Content Reports Portal, Transparency reports and the Advertising repository;
5. The creation of a new process for DSA researcher data access requests; and
6. Establishing a process to obtain information from developers who are "traders" in the EU, in order to comply with Article 30 of the DSA.

[6] Commission Implementing Regulation (EU) .../... of XXX laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022/2065 of the European Parliament and of the Council. https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/14027-Digital-Services-Act-transparency-reports-detailed-rules-and-templates-en
[7] The Year 1 DSA Audit refers to the independent audit that ADI is required to procure annually, pursuant to Article 37 of the DSA.
[8] https://contentreports.apple.com
[9] DSA Transparency reports are available here: https://www.apple.com/legal/dsa/ie/. The third App Store DSA Transparency report will be published on 30 August 2024.
[10] https://www.apple.com/legal/dsa/ie/

Internal stakeholder engagement

DSA Compliance and App Store Legal worked with various teams and business functions responsible for App Store policies, procedures and controls to understand if and how the risk profile of the App Store and its risk mitigation efforts have changed or been tested over the last 12 months. This includes engagement with teams responsible for the following functions:

1. App Review
2. Recommender Systems
3. Apple Search Ads
4. Trust and Safety
5. Privacy Compliance

External stakeholder engagement

Apple routinely engages with external stakeholders in connection with the operation of its services, including via teams dedicated to external engagement and its Government Affairs personnel.

In addition, as described in the First App Store Risk Assessment, senior personnel within each function in the App Store (and who were consulted in connection with this risk assessment) are highly attuned to current events and external commentary affecting the App Store and their functions in particular. They take account of such events and commentary in making ongoing improvements to the risk mitigation measures that they are responsible for. Teams across Apple conduct direct engagement with government bodies, NGOs, relevant trade bodies and interest groups, as well as the press. They are also aware of and responsible for considering concerns raised by the extensive App Store developer community and its users.
Apple's support of and engagement in initiatives such as Safer Internet Day provide a broader platform for discussions across the EU with Government agencies, NGOs, trade associations, and the general public.

Concerns and issues raised have been considered as part of this year's risk assessment efforts.

Overall observations

As detailed above, Apple has continued to engage extensively on matters relating to the DSA in connection with the App Store since it finalised the First App Store Risk Assessment and the VLOP obligations came into effect. This includes direct engagement with the European Commission and Coimisiún na Meán, participation in a number of consultation processes, and active monitoring of issues highlighted in requests for information issued to, and enforcement proceedings regarding, other VLOPs. Despite its low election interference risk profile, Apple actively participated in the European Commission's elections interference stress test and follow-up stakeholder events. This engagement reinforced Apple's approach to the assessment of its risk profile and the adequacy of its risk mitigation measures.

Having now concluded its second assessment of the Systemic Risks and the adequacy of its risk mitigation measures, Apple is satisfied that the conclusions it reached as part of its First Risk Assessment are sound. Nothing in the intervening period has caused Apple to doubt the robustness of its First Risk Assessment or the adequacy of its risk mitigation measures. Apple actively monitors and manages the Systemic Risks and continues its considerable efforts to render the App Store a safe and trusted place for users to discover and download apps.

SECTION 2: APP STORE RISK PROFILE

Section overview

This section of the report describes the risk profile of the App Store, with reference to its key attributes and functionalities. It also describes certain attributes and functionalities that exist on other large online platforms, including VLOPs, that do not exist on the App Store.[11] This is intended to inform the ongoing assessment of the Systemic Risks and their risk mitigation measures, which are detailed in Section 3.

[11] For the avoidance of doubt, the risk profile of the App Store as described in the First App Store Risk Assessment has not changed. The information in this section is supplemental to the risk profile detail set out in that report.

Core App Store attributes and functionalities

For DSA purposes, "recipients of the service" are:

1. Developers of apps; and
2. End users (also referred to as users).

Developers appoint ADI as their commissionaire for the marketing and delivery of apps to end users in the EU. Those end users are users of Apple devices who discover and download apps in the App Store in the EU.

The App Store operates 175 region-specific "storefronts", and users transact through a storefront based on their home country.
Each EU Member State has a separate storefront. The App Store is available in 40 languages, including 17 official languages of the EU. Information presented in the App Store is therefore "localised", such that app metadata is displayed in different languages, depending on a user's location and language settings.

From its inception, the App Store was designed in such a way as to protect users of Apple devices by creating a safe and trusted environment offering a wide variety of curated apps. Every app and every app update submitted to the App Store is closely reviewed by both automated systems and human experts trained to review apps offered on the App Store for safety, user privacy and approved business models, such that they provide a good user experience. This pre-publication review already sets the App Store apart from other online platforms, where content can be posted without any prior checks. Post-publication, apps are subject to ongoing monitoring and multiple controls to enable Apple to take action when it is alerted to problematic developers or apps.

App Store content types

As detailed in Annex 1, there are four types of content on the App Store that users can access, and therefore where, in principle, users could be exposed to illegal content and other risks. These content types are described below, as well as inherent design limitations, which feed into Apple's assessment of how the Systemic Risks could arise from the design, function and use of the App Store.

In addition, we describe below how, in principle, users could be exposed to illegal or problematic content in connection with each content type. For the reasons described below, amongst the four content types on the App Store, the greatest risk of exposure to content giving rise to relevant risks arises in connection with apps and related product page information. As such, a key risk mitigation measure on the App Store is Apple's App Review process.

(a) Apps and related product page information

Worldwide, the App Store hosts approximately 1.8 million apps, which are available for download by users. Apps are recorded against different app "categories", which include books, business, music, navigation, games, entertainment, productivity, and food and drink.

When a user taps on an app during discovery, they are taken to the app product page, which provides information about the app. Most of the information on the app product page is input by the developer, such as developer and app information; app icons, screenshots, and previews; a privacy policy URL; support links; an age rating; and data handling practices. App product pages also contain user ratings and reviews (see below).

All apps available on the App Store, including most of the information that appears on app product pages, have already been submitted to and approved by App Review. As detailed in Annex 1, App Review involves, in every case, an automated element and a human review element.
A key differentiator from other types of online platforms, including social media platforms, is that all apps and app metadata have been subject to review prior to their publication on the online platform.

As described in the First App Store Risk Assessment, absent any risk mitigation measures, an app store could be used to disseminate certain categories of illegal content to users in the EU, including:

1. apps designed to disseminate illegal content or facilitate illegal behaviours, such as fraud, including "bait-and-switch" apps, or apps that are designed to undermine fundamental rights;
2. apps that infringe the intellectual property rights of others; and
3. apps that facilitate activities that are illegal in certain Member States (for example, certain types of real money gambling).

The First App Store Risk Assessment also referred to the risk that in-app content could be defamatory or intended to offend. Apple cannot monitor such content but instead requires developers to ensure that they have controls in place for users to report such content (see below).

(b) User ratings and reviews

User ratings and reviews are the only type of content on the App Store that can be generated by end users of the service.

Users can post a star rating of between 1 and 5 stars, a review "title" and the review itself. Users cannot post images or videos, such that risks arising on other online platforms, such as "deepfakes" and other images that are offensive or discriminatory, do not arise in user ratings and reviews on the App Store. When an end user edits their rating or review, the most recent change will display on the relevant product page. If a user submits a new rating or review, the existing review is replaced.

Developers are also able to post responses to user ratings and reviews. No posting of images or videos is possible.

As detailed in Annex 1, ratings and reviews are subject to terms and conditions (the AMS Terms) and pre- and post-publication controls, including pre-publication scanning and post-publication removal.

Ratings and reviews are not themselves "recommended" by the App Store recommender systems. Instead, consolidated ratings are an input to recommender systems that highlight or profile particular apps to users.

In principle, users could be exposed to illegal content posted by other users, although in practice the primary risk regarding user ratings and reviews relates to fake reviews (although Apple has effective controls in place that are aimed at addressing this risk).

(c) App Store editorial content

App Store editorial content is drafted by human App Store editorial teams. App Store editorial teams create a curated catalogue of apps for each category in the Today tab (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favourites, Get Started). For each curated category, the editorial teams determine whether to "pin" certain categories in designated vertical positions on the Today tab landing page.

From time to time, App Store editorial teams also write content about local events.
For example, in connection with the European Parliamentary elections in June 2024, and in close cooperation with the European Parliament, App Store editors published content with information for users about the elections,[12] including localised information about apps and news sources.

All App Store editorial content is subject to internal editorial guidelines.

In principle, users could be exposed to illegal and problematic content posted by App Store editors, although in practice the overall risk of this content type posing issues is very low, not least because of the small number of Apple personnel who are responsible for drafting content and the subject matter(s) they write about.

[12] See https://www.europarl.europa.eu/news/en/press-room/20240507IPR21413/weekly-election-highlights and https://apps.apple.com/be/story/id1745174009.

(d) Apple Search Ads

Apple Search Ads are the only type of advertising on the App Store.[13] Apple Search Ads provide a means for third-party developers to increase the visibility of their apps that are already distributed on the App Store.

[13] [CONFIDENTIAL]

Apple Search Ads placements are clearly distinguished from organic App Store placements and search results with a prominent "Ad" mark (language localised), and may include border and background shading demarcations. Tapping on the "Ad" mark displays an "About this Ad" sheet, which provides information about why the user has been shown that particular Apple Search Ad and what criteria, if any, were used to display the app campaign.

Apple Search Ads is an entirely optional service for developers, accessible through a separate account (an Apple Search Ads account), using a different web portal from App Store Connect.

Apple Search Ads differ from traditional forms of online advertising that may be present on other large online platforms, in that only pre-approved apps can be advertised. Thus, there is no ability to advertise non-apps, including physical goods or services, on the App Store. Apple Search Ads are subject to additional terms and conditions (beyond the DPLA and App Review Guidelines), which are actively enforced.

In practice, the risk profile for Apple Search Ads is largely the same as for apps, although there is a moderate risk that Apple Search Ads could advertise content to users that is illegal to advertise in their home country or region.

Locations where users encounter content on the App Store

(a) Today tab

The Today tab contains App Store editorial content (see above) and "Top" charts (apps are selected for charts based on the most downloads in the App Store within approximately the past 24-hour period). Editorial content can be "personalised" based on, for example, purchase or download behaviour in the App Store.

(b) The "Games" and "Apps" tabs

The Games and Apps tabs on the App Store provide dedicated experiences for games and apps that inform and engage customers through recommendations on new releases and updates, videos, top charts, and handpicked collections and categories. For these tabs, all apps are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

(c) Search tab

The App Store Search tab provides an additional way for customers to find apps, games, stories, categories, in-app purchases, and developers. Before a user enters a search, the Search tab shows popular or trending queries in the "Discover" section, as well as a list of apps that a user may want to search for in the "Suggested" section.
These apps are selected based on aggregate search behaviour and information curated by Apple's editors. In some cases, suggested queries may be personalised for users in the "Discover" section and apps may be personalised for users in the "Suggested" section, based on prior engagement in the App Store. In sum, the apps shown in Search before a search term is entered are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

Searches use metadata from developers' product pages to deliver the most relevant results. The main parameters used for app ranking and discoverability are the relevance of text/titles, keywords, and descriptive categories provided in the app metadata; user engagement in the App Store, such as the number and quality of ratings and reviews; and application downloads. Date of launch in the App Store may also be considered for relevant searches.

Third party UGC

The First App Store Risk Assessment details at paragraphs 5.5.1 to 5.5.4 the limits of the App Store risk profile, and the scope of Apple's obligations under the DSA. In particular, it explains that where risks arise within the app itself, and therefore outside the App Store, that is the responsibility of the developer, some of which will be online platforms and VLOPs themselves.

Apple has also explained to the European Commission previously that UGC on third-party apps is outside the scope of its content moderation obligations under the DSA. Apple has no means to enforce any rules it might seek to adopt in connection with live moderation of such UGC, as it does not control content within third-party apps. Apple cannot reasonably be expected to monitor and police UGC on third-party apps.[14] Pursuant to the App Store terms and conditions applicable to developers, including the App Review Guidelines, responsibility for moderating UGC on third-party apps is clearly a matter for the developers of those apps. Additionally, any developers that are "intermediary services" under the DSA may have their own legal obligations with regard to content moderation of their apps. This includes several developers that themselves operate apps that have been designated as VLOPs.

Apple reasonably can, and does, however, maintain and enforce contractual obligations for developers that wish to have access to the App Store and wish to allow UGC on their apps. Pursuant to Guideline 1.2, apps with UGC or social networking services must include:

1. a method for filtering objectionable material from being posted to the app;
2. a mechanism to report offensive content and timely responses to concerns;
3. the ability to block abusive users from the service; and
4. published developer contact information.

[14] Indeed, Recital 30 of the DSA, for example, states: "Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or a general active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content." This is also reflected in Article 8, "No general monitoring or active fact-finding obligations".
Attributes and functionalities that do not apply to the App Store

To focus the risk profile of the App Store and to distinguish it from other large online platforms, including other VLOPs, it is important to note that the below common features or characterisations do not apply to the App Store:

It is not a social media platform, an online marketplace for physical goods and services, a messaging service, a pornographic content service, an online chat or discussion service, or a file storage or sharing platform.

To the extent that developers can set up accounts as part of the Apple Developer Program, they are subject to checks and controls that significantly limit the risk of the creation of "fake" accounts.

It is not a service where users can share content anonymously with other users, save that users can post ratings and reviews using a nickname.

The following features do not exist on the App Store:

1. One-to-one end-user chat (whether encrypted or unencrypted). As such, risks to minors and other users that arise in connection with the use of private chat, a feature of other online platforms, do not arise on the App Store;
2. The ability to form closed or small groups of users. As such, risks that arise when users can form closed or small user groups, for example risks to minors or other vulnerable individuals, do not arise on the App Store;
3. The ability for users to livestream content. As such, risks that arise from the ability to livestream content, which often cannot be or is not moderated in real time, do not arise on the App Store;
4. The ability for users to post images or videos or engage in concerted content dissemination.

The App Store is not a general news service or information source (beyond containing information about apps, and some limited information regarding current events (for example, the 2024 Paris Olympics)). As such, any risks of "disinformation" on the App Store are in no way comparable to other online platforms that are used to disseminate news and other general factual information to the public.

SECTION 3: ASSESSMENT OF SYSTEMIC RISKS AND RISK MITIGATION MEASURES

Section overview

This Section contains the results of Apple's assessment of how the Systemic Risks in the EU may stem from the design, functionality or use of the App Store, as well as Apple's identification and assessment of risk mitigation measures that address those risks.

In the First App Store Risk Assessment, Apple concluded that it did not identify any meaningful basis to distinguish risks stemming from the design and function of the App Store from risks stemming from its use. Apple also concluded that it did not identify any risks in the EU beyond or separate from those listed in Article 34(1) that might reasonably be said to stem from the design and function of the App Store, or its use, and that might reasonably be said to be systemic in nature. Nothing in the intervening 12 months has caused Apple to reach different conclusions.

Approach to assessing the Systemic Risks and their mitigation

In this report, the results of Apple's assessment of each Systemic Risk and related risk mitigation measures are set out in table format at the end of this section. For each Systemic Risk, the table contains:

1. Inherent Risk: Any risk that may stem from the design, functioning or use of an app store without reference to risk mitigation measures. This risk is directly linked to functionalities and attributes of the App Store, and recipients of the service, described in Section 2. This includes both the level of risk and a summary of the rationale for the classification.
2. Risk Mitigation Measures: Measures that are in place on the App Store that mitigate the inherent risk.
3. Performance metrics and other risk indicators: This lists the performance metrics and other risk indicators that Apple monitors to assess the level of residual risk.
4. Residual Risk: This records the level of risk after risk mitigation measures and the performance metrics and other risk indicators have been factored into the assessment.
5. Observations regarding controls effectiveness: This records Apple's observations on whether it considers its controls are effective in addressing the relevant risk.

Inherent risks and residual risks are categorised as low, medium or high. This reflects Apple's assessment of risk, factoring in: (a) the nature of the risk; (b) the probability of the risk occurring (improbable, probable, highly probable); and (c) the severity of the risk if it crystallises (low impact, moderate impact, high impact). Probability and severity determinations are then combined and reflected in the risk determinations set out in the tables below.
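To make the combination step concrete, the sketch below shows one conventional way a probability determination and a severity determination can be combined into a low/medium/high rating. The report does not disclose the actual mapping Apple uses, so the matrix and function here are illustrative assumptions only, not Apple's methodology.

```python
# Illustrative sketch only: the report states that probability and severity
# determinations are combined into a low/medium/high rating, but does not
# disclose the exact mapping. This is a conventional 3x3 risk matrix used
# purely to make the combination step concrete; it is an assumption.

PROBABILITY = ("improbable", "probable", "highly probable")
SEVERITY = ("low impact", "moderate impact", "high impact")

# Hypothetical combination matrix: outer key = probability, inner = severity.
RISK_MATRIX = {
    "improbable":      {"low impact": "low",    "moderate impact": "low",    "high impact": "medium"},
    "probable":        {"low impact": "low",    "moderate impact": "medium", "high impact": "high"},
    "highly probable": {"low impact": "medium", "moderate impact": "high",   "high impact": "high"},
}

def risk_level(probability: str, severity: str) -> str:
    """Combine a probability and a severity determination into a risk level."""
    if probability not in PROBABILITY or severity not in SEVERITY:
        raise ValueError("unknown probability or severity category")
    return RISK_MATRIX[probability][severity]

# Example: a probable risk with moderate impact would be rated "medium".
assert risk_level("probable", "moderate impact") == "medium"
```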
Article 34(2) factors

Pursuant to Article 34(2), first paragraph, Apple is required to take account of whether and how certain specified factors may influence any of the Systemic Risks.

(a) Recommender systems and other algorithms

Recital 84 of the DSA states that "where the algorithmic amplification of information contributes to the Systemic Risks", this should be reflected in VLOPs' risk assessments. Apple makes limited use of recommender and other algorithmic systems, but end users of the App Store do receive recommendations with respect to a selected and limited set of apps on the App Store that have already been approved through the App Review process. There is also a limited search function on the App Store, which allows users to search for App Review approved apps and content, and which operates by algorithmic means. Some content placement can be "personalised", but users are given the choice to disable personalised recommendations (except for Child Accounts,[15] where recommendations cannot be personalised).

Controls detailed in Annex 1 ensure that any impact of the App Store's use of recommender systems or other algorithmic systems on the Systemic Risks involves ample and specific risk mitigation; in particular, Apple is confident that its current controls regarding the operation of its recommender systems are such that those systems do not lead to the amplification of information or disinformation that contributes to the Systemic Risks.

Apple teams responsible for its recommender systems have confirmed that during the last twelve months Apple has maintained the controls set out in the First App Store Risk Assessment. As reflected in Annex 1, during the last 12 months Apple made changes to default personalisation settings for users under the age of 18. Previously, personalisation was set to "off" only for Child Accounts; now the default is "off" for both Child Accounts and Teen Accounts.[16] Teen Account users can elect to set personalisation to "on". During the last 12 months Apple has also made modifications to restrict the types of content that will be personalised for Teen Accounts.

[15] "Child Accounts" refer to accounts for users under 13 years of age (or the equivalent minimum age of valid consent without required parental approval).
[16] "Teen Accounts" refer to accounts for users under 18 years of age (or the equivalent age of majority).

Content moderation systems

Apple has in place various content moderation systems on the App Store, which are detailed in Annex 1 and categorised in its DSA transparency reports. During the period covered by this report, Apple has maintained the content moderation systems detailed in the First App Store Risk Assessment.

Content moderation relating to published apps

Apple continually trains and enhances its automated tools to address new and emerging threats and to factor in learning from human-based decision-making, and enhances the processes used by and tools available to human app review specialists.

Content moderation relating to published ads

See further below regarding "Systems for selecting and presenting advertising".

Content moderation relating to user ratings and reviews as well as developer responses

Apple continues to train and enhance the systems it uses to identify problematic user ratings and reviews. In the last 12 months Apple has deployed enhancements in machine learning, automation, and the human review process to detect and remove problematic content. Apple has established ongoing efforts using machine learning models to monitor new types of fraud in user reviews. Moderation tools have also been enhanced to improve efficiency and transparency, including notifications to impacted users. Dedicated moderation resources are regularly reviewed to evaluate the timeliness and quality of Apple's removal processes, including operational process enhancements in removing illegal and unsafe content identified in user reviews.

(b) Applicable terms and conditions

Apple maintains comprehensive terms and conditions – applicable to both developers and end users – that address key risks facing the App Store, including the Systemic Risks. The terms and conditions provide Apple with a basis for taking prompt action in the event that a developer or end user misuses the App Store. Developers and users who object to such action have recourse to various complaints mechanisms.

Apple made changes to both the DPLA and the AMS Terms in response to its DSA obligations. Based on its experience over the last 12 months, Apple remains confident that its terms and conditions provide it with a sound basis for taking action where necessary, against both developers and end users, to mitigate the impact of any Systemic Risks. This includes developer and end user account terminations, app rejections and takedowns, and the removal of user ratings and reviews. Both developers and end users have recourse to complaints and/or appeal mechanisms if they disagree with actions Apple takes against them.

(c) Systems for selecting and presenting advertising

Recital 88 provides that "The advertising systems used by [VLOPs...] can also be a catalyser for the systemic risks". As detailed in Section 2, the only advertising on the App Store is made possible by using Apple Search Ads. Use of Apple Search Ads is subject to controls and in any event does not involve any "new" advertising content; this is a system that developers can use to promote apps that have already been approved by App Review.
As such, Apple does not consider that Apple Search Ads can to any meaningful extent be reasonably or objectively said to be a catalyser for the Systemic Risks.

Apple further notes that Recital 79 to the DSA suggests that the way in which VLOPs "design their services is generally optimized to benefit their often advertising-driven business models and can cause societal concerns." Although certain VLOPs may design their services in this way, it is certainly not the case for the App Store, where Apple Search Ads only provides developers an opportunity to promote their apps and not to "advertise" additional content. The promoted apps have already been reviewed and approved for the App Store and are subject to further review to confirm that they are not in violation of the Apple Search Ads terms and conditions.

Apple now publishes its online Ads Repository, which is accessible here: https://adrepository.apple.com. This lists the Apple-delivered ads on the App Store in EU storefronts, as well as "Restricted Advertising", which lists both account suspensions and the Apple-delivered ads on the App Store that were removed from EU storefronts after publication, due to terms and conditions violations.

The Apple team responsible for Apple Search Ads has confirmed that during the last twelve months Apple has maintained the controls set out in the First App Store Risk Assessment.

Data-related practices of the provider

Apple's data-related practices are a central differentiator of the App Store, and the whole Apple ecosystem; Apple provides its customers with market-leading standards of protection of privacy, complying in full with applicable data protection and privacy laws. This risk assessment, including the assessment of the Charter right to the protection of personal data below, addresses extensively all relevant privacy and data protection considerations.

Teams responsible for App Store data-related practices have confirmed that the controls detailed in Annex 1 and the First App Store Risk Assessment remain in place.

Intentional manipulation of the App Store

Pursuant to Article 34(2), second paragraph, Apple is required to analyse how the Systemic Risks are influenced by intentional manipulation of the App Store.[17]

Malicious actors are constantly seeking to circumvent App Store risk mitigation measures so as to publish or promote apps on the App Store. Where relevant, particularly with respect to "illegal content", Apple has addressed and factored such intentional manipulation into its risk analysis.

The results of Apple's 2023 efforts to reduce the occurrence of fraud on the App Store are detailed in Apple's 2024 fraud prevention analysis.[18] In summary, it states that:

1. As digital threats have evolved in scope and complexity over the years, Apple has expanded its antifraud initiatives to address these challenges and help protect its users. Every day, teams across Apple monitor and investigate fraudulent activity on the App Store, and utilise sophisticated tools and technologies to weed out bad actors and help strengthen the App Store ecosystem.
2. From 2020 through 2023, Apple prevented a combined total of over $7 billion in potentially fraudulent transactions, including more than $1.8 billion in 2023 alone.
In the same period, Apple blocked over 14 million stolen credit cards and more than 3.3 million accounts from transacting again.
3. In 2023, Apple:
   a. Terminated ca. 118,000 developer accounts for potentially fraudulent activity;
   b. Blocked ca. 91,000 fraudulent developer accounts from being created; and
   c. Deactivated ca. 374 million fraudulent customer accounts.

[17] Recital 84 provides further context, which Apple has factored into its assessment.
[18] https://www.apple.com/newsroom/2024/05/app-store-stopped-over-7-billion-usd-in-potentially-fraudulent-transactions/

Regional or linguistic aspects

Pursuant to Article 34(2), third paragraph, Apple is also required to take into account specific regional or linguistic aspects, including any that are specific to a particular Member State, when assessing the Systemic Risks. Recital 84 provides that "Where risks are localised or there are linguistic differences", VLOPs should account for this in their risk assessments.

Apple does not consider that regional or linguistic aspects have a material impact on the Systemic Risks that might reasonably be argued to stem from the App Store, and has seen nothing during the course of the last 12 months to suggest the contrary. The App Store is available in 40 languages. While individual storefronts may address users in or with a connection to particular Member States, and while linguistic and local editorial coverage is provided across those regions and languages, the App Store service and risk mitigation measures are not substantively varied across the EU, other than as may be required by law.

Performance metrics App Store uses to monitor systemic risks

[CONFIDENTIAL]

The "performance metrics" listed in the tables below refer to a range of metrics and information sources that Apple collects and monitors in connection with its ongoing management of the Systemic Risks, to inform its assessment and management of the residual risks and the effectiveness of its listed risk mitigation measures. These include:

(a) App Review metrics

This includes app rejections and approvals, as well as appeals and reinstatement metrics. These metrics contribute to Apple's ongoing assessment of risk and the effectiveness of its risk mitigation measures, particularly as they relate to App Review. These are published in non-DSA App Store transparency reports.[19]

[19] Non-DSA App Store Transparency Reports and their supporting data are available here: https://www.apple.com/legal/more-resources/

(b) Content moderation metrics

This includes measures taken in the EU in connection with published apps, ratings and reviews, and Apple Search Ads. These details are published in Apple's DSA Transparency reports. These metrics contribute to Apple's ongoing assessment of risk and the effectiveness of its risk mitigation measures.

(c) Article 9 orders, and non-DSA takedown notices

This includes Article 9 orders and non-DSA takedown notices. Apple tracks and monitors all such notices. It reports Article 9 orders in its DSA Transparency reports. During the period covered by this report, Apple received one Article 9 order. Apple considers the absence or low number of Article 9 orders and takedown notices to be a relevant metric in assessing risks on the App Store.

(d) Article 16 illegal content notices

This includes reports of alleged illegal content, via the Content Reports Portal. Apple tracks and monitors all Article 16 notices, both with respect to their substance and processing times, as part of its DSA compliance efforts. It also reports on such detail in its DSA Transparency reports. Again, these notices assist Apple in its ongoing assessment of risk and the effectiveness of its risk mitigation measures.
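The report does not describe the internal tracking system behind this monitoring. The sketch below illustrates the kind of record-keeping that tracking notice "substance and processing times" implies; every field name and the median metric are hypothetical illustrations, not a description of Apple's actual systems.

```python
# Minimal sketch of tracking Article 16 notices and their processing times.
# All field names and the median metric are hypothetical assumptions used to
# illustrate the monitoring described above.
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class Article16Notice:
    notice_id: str
    content_type: str          # e.g. "app", "rating_or_review", "ad"
    alleged_illegality: str    # the substance of the notice
    received_at: datetime
    resolved_at: datetime | None = None
    action_taken: str | None = None  # e.g. "removed", "no_action"

    def processing_hours(self) -> float | None:
        """Hours from receipt to resolution, if resolved."""
        if self.resolved_at is None:
            return None
        return (self.resolved_at - self.received_at).total_seconds() / 3600

def median_processing_hours(notices: list[Article16Notice]) -> float:
    """Median processing time across resolved notices - one plausible
    transparency-report style metric."""
    times = [h for n in notices if (h := n.processing_hours()) is not None]
    return median(times)
```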
(e) External feedback and commentary

This includes, but is not limited to, any feedback that may be received directly from the European Commission and Coimisiún na Meán, civil society groups and researchers, as well as publicly available information about issues impacting the Systemic Risks and how they might arise in the EU, both on other platforms and the App Store.

By way of example, with respect to the Article 34(1)(c) Systemic Risk relating to actual or foreseeable negative effects on electoral processes, Apple has had reference to the European Commission's guidelines on elections interference, the European Commission's report on Russian disinformation campaigns,[20] news articles and reports regarding attempts to interfere in the European Parliamentary elections, publications from the European Parliament about such attempts, and information about risks and mitigation measures Apple learned during the election interference stakeholder events.

[20] "Application of the risk management framework to Russian disinformation campaigns", European Commission, August 2023: https://op.europa.eu/en/publication-detail/-/publication/c1d645d0-42f5-11ee-a8b8-01aa75ed71a1/language-en

Annex 1 – Overview of App Store features and relevant policies, procedures and controls

Overview

This Annex provides an overview of the lifecycle of an app in the App Store – including app discovery, where users learn about and download apps. This Section also summarises the stages before app discovery: developer onboarding; app review; and recommender, advertising, and moderation systems that impact the presentation of apps and reviews to customers. This Section also addresses the App Store's notice and action mechanisms, which help to mitigate potential App Store risks, as well as external risks that are the responsibility of developers, and Apple's DSA Compliance function, website and its DSA transparency reports.

The App Store provides app discovery and distribution

Developers appoint ADI as their commissionaire for the marketing and delivery of apps to end users in the EU. Those end users are users of Apple devices who discover and download apps in the App Store, through one of the five landing pages (tabs) – "Today", "Games", "Apps", "Arcade", and "Search" – or by visiting the product page of an app.[1]

Below is an overview of how App Store discovery works from the end user's perspective, and where they encounter content in the App Store that could in principle engage the Systemic Risks.

The App Store operates 175 country- or region-specific "storefronts", and users transact through a storefront based on their home country. Each EU Member State has a separate storefront.[2] The App Store is available in 40 languages, including 17 official languages of the EU.[3] Information presented in the App Store is therefore "localised", such that app metadata[4] is displayed in different languages, depending on a user's location and language settings. Editorially curated content (described below) may vary, depending on a user's location.

[1] There is some variation between the tabs available on each App Store. The five tabs listed in this paragraph appear on the iOS and iPadOS App Stores.
[2] For App Store availability in EU storefronts, see https://support.apple.com/en-us/HT204411
[3] https://developer.apple.com/localization/
[4] In this Report, app metadata comprises text (such as title, descriptions and keywords) and visuals (such as icon, screenshots and video) that are shown in the App Store.

The "Today" tab

The Today tab is the first page a user sees when they click on the App Store icon on their device. Apple considers this a "daily destination" with original stories from App Store editors, featuring exclusive premieres, new app releases, Apple's all-time favourite apps, an "App of the Day", a "Game of the Day", and more.
It offers tips and how-to guides to help customers use apps in innovative ways, and showcases interviews with inspiring developers. Stories are selected based on curation by the App Store Editorial team, and they share Apple's perspective on apps and games and how they impact users' lives, using artwork, videos, and developer quotes to bring apps to life.

App Store editors create a curated catalogue of apps for each category in the Today tab (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favourites, Get Started). For each curated category, the Editorial team determines whether to "pin" certain categories in designated vertical positions on the Today tab landing page.

The Today tab also features "Top" charts, such as Top Free Games and Top Paid Games with various categories (AR Games, Indie Games, Action Games, Puzzle Games, Racing Games, Simulation Games); Top Free Apps and Top Paid Apps with various categories (Apple Watch Apps, Entertainment, Health & Fitness, Kids, Photo & Video, Productivity); Top Podcasting Apps; and Top Arcade Games. Apps are selected for charts based on the most downloads in the App Store within approximately the past 24-hour period.

App Store editors can also choose to have categories personalised for the user based on prior engagement (for example, purchase or download) behaviour in the App Store. If a story has been personalised, the Today tab would surface and order stories that are most relevant based on a user's purchase and download history. For example, personalised stories related to games may be surfaced as relevant to users who recently downloaded apps in the games category.

The "Games" and "Apps" tabs

The Games and Apps tabs on the App Store provide dedicated experiences for games and apps that inform and engage customers through recommendations on new releases and updates, videos, top charts, and handpicked collections and categories. For these tabs, all apps are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

When considering apps to feature in these tabs, App Store editors look for high-quality apps across all categories, with a particular focus on new apps and apps with significant updates.

"Arcade" tab

The Arcade tab in the App Store features games which are made available as part of Apple's subscription service "Apple Arcade".

Search tab

The App Store Search tab provides an additional way for customers to find apps, games, stories, categories, in-app purchases, and developers. Before a user enters a search, the Search tab shows popular or trending queries in the "Discover" section, as well as a list of apps that a user may want to search for in the "Suggested" section. These apps are selected based on aggregate search behaviour and information curated by Apple's editors. In some cases, suggested queries may be personalised for users in the "Discover" section and apps may be personalised for users in the "Suggested" section, based on prior engagement in the App Store.
In sum, the apps shown in Search before a search term is entered are selected based on algorithmic relevance, App Store Editorial curation, and top charts.

Searches use metadata from developers' product pages to deliver the most relevant results. The main parameters used for app ranking and discoverability are the relevance of text/titles, keywords, and descriptive categories provided in the app metadata; and user engagement in the App Store, such as the number and quality of ratings and reviews and application downloads. Date of launch in the App Store may also be considered for relevant searches.

App product page

When a user taps on an app during discovery, they are taken to the app product page, which provides information about the app. Most of the information on the app product page is input by the developer, such as developer and app information; app icons, screenshots, and previews; a privacy policy URL; support links; an age rating; and data handling practices. The App Store also provides customer rating and review information on the app product page. This is the only UGC on the App Store. If the user has downloaded the app, they see a link to the Report a Problem feature, which lets customers request a refund, report a quality issue, report a scam or fraud, or report offensive, illegal or abusive content.

Apple's paid app placement option on the App Store (Apple Search Ads)

Developers may also engage in paid promotion of their apps in the App Store through Apple Search Ads, which provides a means for third-party developers to increase the visibility of their apps that are already distributed on the App Store. Through Apple Search Ads, apps may be displayed in the Today tab; the Search tab and Search Results; and on the product page while browsing.

Apple Search Ads placements are clearly distinguished from organic App Store placements and search results with a prominent "Ad" mark (language localised), and may include border and background shading demarcations. Tapping on the "Ad" mark displays an "About this Ad" sheet, which provides information about why the user has been shown that particular Apple Search Ad and what criteria, if any, were used to display the app campaign.

Apple Search Ads is a purely optional service for developers, accessible through an independent account (an Apple Search Ads account), using a different web portal from App Store Connect.[5] Apple Search Ads were made available to users in certain EU storefronts five years ago; more were added thereafter.[6] Today, Apple Search Ads are available to users in most EU storefronts,[7] though only a small percentage of App Store developers choose to promote their apps using Apple Search Ads. If developers choose not to use the Apple Search Ads service to promote their app, their app will still appear across the various available organic placements of the App Store, including within search results, just as it would if the developer had chosen to use Apple Search Ads for securing promoted placements. The two services and placement algorithms work separately from each other.

[5] App Store Connect is a developer tool where developers upload, submit, and manage their apps.
[6] https://searchads.apple.com/countries-and-regions
[7] Apple Search Ads is not available to users on the Bulgaria, Estonia, Latvia, Lithuania, Luxembourg, Malta, Slovakia, or Slovenia storefronts.
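Taken together, the organic search ranking described above can be thought of as a scoring function over metadata relevance, engagement, downloads, and launch date. The sketch below makes that reading concrete; the weights, field names, and linear form are assumptions made purely for illustration, since Apple does not publish its actual ranking formula.

```python
# Illustrative sketch of the organic search ranking described above as a
# weighted scoring function. Weights, fields, and the linear combination are
# hypothetical; this is not Apple's actual algorithm.
from dataclasses import dataclass

@dataclass
class AppSearchSignals:
    text_relevance: float      # match of title/keywords/categories to the query, 0..1
    rating_quality: float      # e.g. normalised average star rating, 0..1
    rating_count: float        # normalised number of ratings, 0..1
    downloads: float           # normalised download volume, 0..1
    recency_boost: float = 0.0 # optional boost for recently launched apps, 0..1

# Hypothetical weights: metadata relevance dominates, engagement and downloads
# contribute, and launch date is a minor factor for relevant searches.
WEIGHTS = {
    "text_relevance": 0.5,
    "rating_quality": 0.15,
    "rating_count": 0.1,
    "downloads": 0.2,
    "recency_boost": 0.05,
}

def ranking_score(s: AppSearchSignals) -> float:
    """Combine the signals into a single discoverability score."""
    return (WEIGHTS["text_relevance"] * s.text_relevance
            + WEIGHTS["rating_quality"] * s.rating_quality
            + WEIGHTS["rating_count"] * s.rating_count
            + WEIGHTS["downloads"] * s.downloads
            + WEIGHTS["recency_boost"] * s.recency_boost)

# Example: rank candidate apps for a query, highest score first.
candidates = [AppSearchSignals(0.9, 0.8, 0.4, 0.3),
              AppSearchSignals(0.6, 0.9, 0.9, 0.9)]
ranked = sorted(candidates, key=ranking_score, reverse=True)
```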
App Store processes and functions help to provide a safe and trusted place for customers to discover and download apps

The content below provides a summary of the App Store process from a developer perspective.

Developers are screened and must agree to terms and conditions

(i) Sanctions screening

Apple conducts sanctions screening for all developers who wish to join the Apple Developer Program. Developer names and contact details are run against government consolidated sanctions lists. Two types of sanctions screenings are conducted: one for individuals, based on information submitted in the Developer Information page, and one for organisations, based on information submitted in the Enrolment Information page of the enrolment.

Where a sanctions report contains a positive hit and the developer challenges a positive sanctions determination, the Global Export Sanctions Compliance team will seek more information from the developer. They then factor that additional information into any final determination.

Apple also conducts ongoing sanctions monitoring to ensure that developers who are already admitted to the Apple Developer Program have not been added to a sanctions list.

(ii) Identity verification and screening

Before an app can be published in the App Store, a developer must register to enrol as an Apple Developer. A developer must sign in with an Apple ID with two-factor authentication, review and accept the latest terms of the Apple Developer Agreement,[8] and enter identity information. If the developer is enrolling via the Apple Developer app, they are asked to verify their identity with a driver's licence or government-issued photo ID.

Trust & Safety Developer Fraud conducts identity verification and other risk-based checking, in order to identify developers who, it considers, may be unlikely to comply with the Apple Developer Agreement (the "ADA") and DPLA. Apple uses submitted developer data as a secure hash to scan for and block developers attempting to register multiple accounts.

The World Wide Developer Relations team conducts a screening intended to prevent fraudulent developers from enrolling, including verifying developer identity, enrolment country, and financial information, as well as automated checks against existing and terminated developer accounts to ensure that bad actors (that is to say, developers who have previously committed or show indicators of intending to commit serious breaches of the ADA, DPLA or App Review Guidelines) and associates do not re-enter the program.

If a developer passes this round of screening, they can then execute the DPLA and begin the multi-step process of submitting an app for distribution on the App Store.
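The report states only that submitted developer data is used "as a secure hash" to detect attempts to register multiple accounts. The sketch below illustrates that general technique; the field choices, normalisation, and keyed hash are assumptions for illustration, not Apple's implementation.

```python
# Minimal sketch of hash-based duplicate-registration screening as a general
# technique. Field choices, normalisation, and the keyed hash are assumptions;
# the report only says developer data is used as a secure hash to detect
# attempts to register multiple accounts.
import hashlib
import hmac

SECRET_KEY = b"server-side-secret"  # hypothetical; a real key would be managed securely

def identity_digest(name: str, email: str, payment_ref: str) -> str:
    """Derive a stable, non-reversible digest from normalised identity fields."""
    normalised = "|".join(field.strip().lower() for field in (name, email, payment_ref))
    return hmac.new(SECRET_KEY, normalised.encode("utf-8"), hashlib.sha256).hexdigest()

known_digests: set[str] = set()  # digests of existing and terminated accounts

def check_enrolment(name: str, email: str, payment_ref: str) -> bool:
    """Return True if the enrolment looks new; False if it matches a known account."""
    digest = identity_digest(name, email, payment_ref)
    if digest in known_digests:
        return False  # block: matches an existing or terminated account
    known_digests.add(digest)
    return True
```

Keeping only a keyed digest, rather than the raw identity fields, lets duplicate registrations be detected without the comparison store itself exposing the underlying personal data.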
(iii) Trader Traceability

Pursuant to Article 30(1) of the DSA, since February 2024 Apple also obtains information from developers who specify that they meet the definition of a "trader", including (a) the developer's name, address, telephone number and email address; (b) identification documents; (c) payment account details; (d) registration information; and (e) self-certification by the developer committing to only offer products or services that comply with applicable EU law. This detail is published on the trader's app product page.

General review practices

(i) App Review Guidelines

The Guidelines are the cornerstone of the App Review process. The preamble to the Guidelines notes that the guiding principle of the App Store is to provide a safe experience for users to get apps and a great opportunity for all developers to be successful. The App Review team evaluates all new apps and app updates to ensure compliance with the Guidelines. Through application and enforcement of the Guidelines, the App Store aims to limit potential risks, including the Systemic Risks within its control. While Apple is unable to monitor or prevent content hosted within third-party apps, the Guidelines provide detailed, comprehensive and relevant requirements regarding developers' own risk mitigation responsibilities.

Particularly relevant to the DSA are Guidelines that:
a) Prohibit objectionable content;
b) Contain specific rules for apps with UGC;
c) Contain specific rules for apps in the Kids category;
d) Require developers to set appropriate age ratings; and
e) Require compliance with privacy, intellectual property, consumer protection and all other applicable laws, including the U.S. Federal Children's Online Privacy Protection Rule ("COPPA") and GDPR.

Below are summaries of some of these important Guidelines that play an important role in the App Store's risk mitigation measures.

(ii) Section 1: Specific app review practices for "Safety"

Section 1 of the Guidelines on Safety states that users expect to feel safe in installing an app from the App Store, and need to have confidence that the app will not contain upsetting or offensive content, damage their device, or cause physical harm. In 2022, 92,598 apps were rejected for non-compliance with Section 1 of the Guidelines.9 In 2023, 103,629 apps were rejected for non-compliance with this section.

9 App submissions may be rejected for non-compliance with one or more Guidelines.

(A) Objectionable content

Section 1.1 (Objectionable Content) states that "Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste." Among other things, this section prohibits apps that contain:
a) defamatory, discriminatory, or mean-spirited content;
b) portrayals of people being killed, tortured, or abused;
c) content that encourages violence, or illegal or reckless use of weapons;
d) overtly sexual or pornographic material, including apps that may include pornography or be used to facilitate prostitution, or human trafficking and exploitation; or
e) harmful concepts which capitalise on current events.

(B) User-generated content

Section 1.2 (User-Generated Content) states that apps with UGC present particular challenges, ranging from intellectual property infringement to anonymous bullying.
To prevent abuse, apps with UGC or social networking services must include:
a) a method for filtering objectionable material from being posted to the app;
b) a mechanism to report offensive content and timely responses to concerns;
c) the ability to block abusive users from the service; and
d) published developer contact information.

Section 1.2 also provides that apps with UGC or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (for example "hot-or-not" voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice.

(C) Kids category10

10 The Kids category on the App Store comprises apps specifically designed for children ages 11 and under. Developers place their apps in one of three age bands based on the app's primary audience: 5 and under, 6 to 8, or 9 to 11.

Section 1.3 (Kids Category) provides that apps in the "Kids" category must not include links out of the app, purchasing opportunities, or other distractions to kids unless reserved for a designated area behind a "parental gate".11 (A simplified illustration of a parental gate follows below.) In addition to complying with privacy laws applicable to children, Kids Category apps may not send personally identifiable information or device information to third parties, and should not include third-party analytics or third-party advertising. In limited cases, third-party analytics may be permitted, provided that the services do not collect or transmit any identifiable information about children (such as name, date of birth, or email address), their location, or their devices. Any third-party contextual advertising services in Kids Category apps must have publicly documented practices and policies for Kids Category apps that include human review of ad content for age appropriateness (and a link must be provided to such policies and practices when the app is submitted for App Review).

11 A parental gate presents an adult-level task that must be completed in order to continue. The App Store provides developers with guidance regarding the creation of parental gates here: https://developer.apple.com/app-store/kids-apps/
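By way of illustration, a parental gate of the kind described in the footnote above might look like the following SwiftUI sketch. The arithmetic task, layout, and names are hypothetical; developers should rely on Apple's published kids-app guidance rather than this example:

```swift
// Illustrative sketch only: a minimal "parental gate" that asks an adult-level
// arithmetic question before unlocking a restricted area. The task and design
// are hypothetical, not a prescribed implementation.
import SwiftUI

struct ParentalGateView: View {
    // A question most young children cannot answer quickly.
    @State private var a = Int.random(in: 12...19)
    @State private var b = Int.random(in: 12...19)
    @State private var answer = ""
    let onUnlock: () -> Void

    var body: some View {
        VStack(spacing: 16) {
            Text("Ask a grown-up: what is \(a) x \(b)?")
            TextField("Answer", text: $answer)
                .keyboardType(.numberPad)
                .textFieldStyle(.roundedBorder)
            Button("Continue") {
                // Unlock only when the gate task is completed correctly.
                if Int(answer) == a * b { onUnlock() }
            }
        }
        .padding()
    }
}
```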
(D) Physical harm

Section 1.4 (Physical Harm) warns that apps that present risks of physical harm may be rejected and, for example, prohibits apps that encourage:
a) consumption of tobacco and vape products, illegal drugs, or excessive amounts of alcohol;
b) drink-driving or other reckless behaviour, such as excessive speed; or
c) use of devices in a way that risks physical harm to users or others.

(iii) Section 2: Specific app review practices for "Performance"

Section 2.3 requires developers to ensure that all app metadata, including privacy information, their app description, screenshots, and previews, accurately reflects the app's core experience. Section 2.3.8 requires all app metadata, including app and in-app purchase icons, screenshots, and previews, to adhere to a 4+ age rating, even if the app itself is rated higher. By way of example, even if a developer's game includes violence, images on the App Store should not depict a gruesome death or a gun pointed at a specific character.

(iv) Section 5: Specific app review practices for "Legal"

Section 5 of the Guidelines states that apps must comply with all legal requirements in any location where developers make them available, and specifies that the developer is responsible for understanding and ensuring that their app conforms with all local laws, including but not limited to intellectual property laws. In addition, Section 5 states that apps that solicit, promote, or encourage criminal or clearly reckless behaviour are unacceptable, and warns that in extreme cases, such as apps found to facilitate human trafficking and / or the exploitation of children, the appropriate authorities will be notified.

In 2022, 441,972 apps / app updates were rejected for non-compliance with Section 5 of the Guidelines. In 2023, 420,914 apps were rejected for non-compliance with this section.

(A) Privacy

Section 5.1 (Privacy) states that protecting user privacy is paramount in the Apple ecosystem, and that developers must be careful when handling personal data to ensure compliance with, among other things, privacy best practices, applicable laws, the terms of the DPLA, and customer expectations.

(B) Data practices

Section 5.1.1 (Data Collection & Storage) provides that all apps must:
a) include a link to their privacy policy, which must comply with Section 5.1, in an easily accessible manner;
b) secure user consent for the collection of user or usage data;
c) provide an easily accessible and understandable way to withdraw consent;
d) only request access to data relevant to the core functionality of the app;
e) respect user permission settings;
f) allow app use without a login if the app does not rely on account-based features; and
g) not compile personal information without the user's explicit consent.

Section 5.1.2 (Data Use & Sharing) further requires that, unless explicitly permitted by law, all apps must:
a) not use, transmit, or share someone's personal data without first obtaining their permission;
b) obtain explicit permission via the App Tracking Transparency APIs to track their activity;
c) not repurpose data collected for one purpose without additional user consent; and
d) not attempt to secretly build a user profile based on collected data.

(C) Health

Section 5.1.3 (Health and Health Research) states that health, fitness, and medical data are especially sensitive, and sets out additional rules for apps with such a focus.

(D) Kids

Section 5.1.4 (Kids) contains additional privacy and data requirements for children:
a) apps must comply with all children's data protection laws (for example, COPPA and GDPR);
b) apps should not include third-party analytics / advertising if intended for kids;
c) use of terms like "For Kids" and "For Children" is reserved for the Kids Category; and
d) apps not in the Kids Category cannot imply that the app is for children.

(E) Location services

Section 5.1.5 (Location Services) provides that use of location services in an app is only appropriate if:
a) it is directly relevant to the features and services provided by the app;
b) the purpose of location services has been explained to the user; and
c) the user has been notified and has provided consent before the collection, transmission, or use of any location data.

(The sketch after this list illustrates the consent sequence.)
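To illustrate the consent sequence summarised in points (a) to (c), the following sketch uses the real CoreLocation request API inside a hypothetical wrapper class; the purpose explanation shown to the user is supplied via the app's Info.plist (NSLocationWhenInUseUsageDescription):

```swift
// Illustrative sketch only: request location access in a way consistent with
// the consent requirements summarised above. Explain the purpose (Info.plist
// purpose string), ask before collecting, and collect only after consent.
import CoreLocation

final class LocationConsent: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func requestIfNeeded() {
        manager.delegate = self
        // Only prompt when the user has not yet made a decision.
        if manager.authorizationStatus == .notDetermined {
            manager.requestWhenInUseAuthorization()
        }
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation() // begin only after consent is given
        default:
            break // denied or restricted: do not collect any location data
        }
    }
}
```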
(F) Intellectual property

Section 5.2 (Intellectual Property) requires developers to only include content in their app if they own it, or are licensed or otherwise have permission to use it, and directs developers who believe that their intellectual property rights have been infringed by another developer on the App Store to submit a claim via the App Store Content Dispute web form.12 If the app features third-party trademarks or copyrighted content, or lets users stream or download third-party content, the developer must provide with its app submission its authorisation to use such content.13

12 https://www.apple.com/legal/intellectual-property/dispute-forms/app-store/
13 https://developer.apple.com/app-store/review/

(G) Gaming, Gambling and Lotteries

Section 5.3 (Gaming, Gambling, and Lotteries) states that developers must fully vet their legal obligations everywhere their app is available. Among other requirements, apps used in connection with real money gaming or lotteries:
a) cannot use in-app purchase to purchase credit or currency;
b) must have the necessary licensing and permissions where the app is used;
c) must be geo-restricted to those locations; and
d) must be free on the App Store.

(H) Developer Code of Conduct

Section 5.6 contains the Developer Code of Conduct. It requires developers to treat everyone with respect, including in responses to App Store reviews, customer support requests and dealings with Apple. The Code of Conduct prohibits harassment, discriminatory practices, intimidation, and bullying. Repeated manipulative, misleading, or fraudulent behaviour will result in removal from the Apple Developer Program. It further states that apps should never attempt to "rip off" customers, trick them into making unwanted purchases, force them to share unnecessary data, or engage in manipulative practices within or outside of the app. The Code of Conduct also states that:
a) developer and app information must be truthful, relevant, and current;
b) manipulating the customer experience (for example, charts, search, reviews, or app referrals) is not permitted; and
c) indications that customer expectations are not being met (for example, excessive customer complaints, negative reviews, and excessive refund requests) may result in termination.

App review escalations and new and emerging issues

During the App Review process, app reviewers may escalate issues to App Review specialist teams or other functional groups, as needed, to provide input, to work with developers on compliance issues, or to take action against problematic apps. New and emerging issues are often escalated in order to seek guidance on the appropriate path forward, including, for example, in response to specific events, [CONFIDENTIAL], or new technologies, [CONFIDENTIAL]. Below are the key groups involved in app escalations.

(i) App Review Compliance

This team tracks trends in misleading app concepts and signals, as well as app spam issues. An app reviewer may escalate an app to this team to investigate app behaviour, including whether behaviour has changed since an initial review, to determine whether the app exhibits fraudulent or misleading functionality, or to determine whether developer-hosted content violates the Guidelines. If there is a problem, this team will work with the developer to bring the app into compliance or remove the app from the App Store, if appropriate.
(ii) App Store Improvements / Technical Investigations

If an app reviewer identifies a need for a deeper analysis of the technical functionality of an app, they will escalate the issue to Technical Investigations. For example, this team investigates whether an app uses private APIs that may violate the Guidelines' privacy and data collection requirements. Based on the results of a Technical Investigation, the app reviewer may reject the app. Additionally, learnings collected during these investigations are applied to help develop and refine automated review tools, to determine if existing and future app submissions contain similar issues. (A toy illustration of a static check of this kind follows.)
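A static check of this kind might be sketched as follows. The symbol list, the extraction step, and the escalation rule are all hypothetical placeholders; the example only illustrates the matching logic:

```swift
// Illustrative sketch only: a toy static check of the kind a technical
// investigation might automate, scanning symbols referenced by a submitted
// binary against a list of non-public API names. The list entries are
// placeholders, not real Apple symbols.
import Foundation

let nonPublicSymbols: Set<String> = [
    "_privateFrameworkCall",   // hypothetical placeholder
    "_undocumentedSelector"    // hypothetical placeholder
]

func findPrivateAPIUse(referencedSymbols: [String]) -> [String] {
    referencedSymbols.filter { nonPublicSymbols.contains($0) }
}

// Usage: feed in symbols extracted from the app binary with external tooling
// and escalate when matches are found.
let hits = findPrivateAPIUse(referencedSymbols: ["_main", "_privateFrameworkCall"])
if !hits.isEmpty {
    print("Escalate: references non-public API symbols: \(hits)")
}
```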
(iii) App Review Policy

If an app presents a new or unique issue that requires policy or Guideline interpretation, an app reviewer will escalate that issue to the App Review Policy team. This team investigates novel apps, evolving technologies, and current trends in apps, as well as highly sensitive and legal issues. This team regularly works with and seeks advice from other functional groups, [CONFIDENTIAL]. The App Review Policy team meets on a weekly basis, and as needed, to consider app policy escalations. The App Review Policy team drives the evolution of App Review's policy enforcement efforts and informs the ongoing development of internal policies and updates to the Guidelines.

(iv) Legal, privacy, government affairs, child safety, global security investigations & regional experts

As explained above, the App Review teams are educated on potential legal issues and risks, including on topics such as CSAM, illegal content, suppression of human rights, and misleading public health information. On a daily basis, App Review escalates app issues to senior management in App Review and the App Store Legal team. The App Store Legal team provides legal advice and coordinates with various other internal legal and regulatory teams (including EU-based teams) across Apple (for example, Privacy Compliance, Privacy Legal, EU Regulatory Legal, Human Rights, Child Safety, Global Security), as well as external counsel, for input and advice on complex issues presented by apps.

(v) ERB

The ERB is composed of senior leaders who have ultimate decision-making responsibility regarding access for apps to the App Store. The ERB meets regularly and receives updates and management information from various App Store functions, including App Review and App Store Legal. These updates detail information regarding App Review processing times and approval / rejection information, and new and emerging issues, including new and novel types of apps. Where escalation issues cannot be resolved by the App Review team or the App Store Legal team, they are escalated to the ERB. The ERB will then decide next steps, including app takedowns, further engagement, or an exploration of viable alternatives, as appropriate.

App review rejections, suspensions, terminations, appeals

The underlying philosophy of the App Review team is to work with developers to ensure that apps are compliant with the Guidelines, as well as local legal and regulatory requirements. If an app under review is in violation of the Guidelines, the team may reach out to the developer to work with them on remediation, unless, for example, the app is clearly fraudulent. If the app is rejected, the developer receives a message describing the reasons for the rejection. The message identifies the Guideline that the app violates, describes the ways in which the Guideline has been violated, and provides next steps to help resolve the rejection, including access to additional resources. Developers may also request a call to discuss issues with an App Review specialist.

The App Review team may, depending on the severity of the issue, afford the developer 14 to 30 days to rectify an objectionable content issue (for example, by content takedowns or user blocking) before removing the app or taking additional measures. They may also require the developer to update their content moderation plan and confirm that mitigation measures are in place to avoid recurring issues. Developers can respond to the reviewer with a request for additional information or further discussion of the issues, or may dispute the findings.

App removals and developer terminations are the most severe measures, undertaken in circumstances where remediation attempts have failed or are not an option, such as where the app is fraudulent or facilitates illegal activity.

As explained in the "After You Submit" section of the Guidelines, developers can dispute decisions of App Review regarding app rejections or developer terminations via an appeals process, which is overseen by the App Review Board (the "ARB").14 The ARB is composed of experienced App Review specialists who investigate claims asserted in an appeal, and the history of the app and interactions with the developer, and who seek input from specialised functions where appropriate.

14 https://developer.apple.com/app-store/review/ - see "Appeals". This page includes a link to a form for developers to submit appeals.

Very few appeals are sustained, which tends to confirm the robust nature of app removal and developer termination decisions. For example, in 2022, Apple removed 186,195 apps from the App Store. Only 18,412 of those decisions were appealed, and 616 resulted in the app being restored.15 Similarly, 428,487 developer accounts were terminated. Only 3,338 developer account terminations were appealed and, of those, 159 resulted in a restoration.16 In 2023, Apple removed 116,117 apps from the App Store. 18,628 of the removals were appealed, and 322 resulted in the app being restored. During the same period, Apple terminated 117,843 developer accounts. 4,737 of the developer account terminations were appealed; of those, only 126 resulted in the restoration of the developer account.17

15 As noted in the 2022 Transparency Report, most app removals that are appealed are removed from the App Store due to illegality or fraud. Consequently, most appeals from developers of such apps are rejected.
16 https://www.apple.com/legal/more-resources/docs/2022-App-Store-Transparency-Report.pdf
17 https://www.apple.com/legal/more-resources/docs/2023-App-Store-Transparency-Report.pdf

Ongoing monitoring

The App Review process does not stop once an app is approved and published on the App Store. This is necessary for a number of reasons:

a) Initial automated and human review cannot be expected to have a 100% success rate. Problematic app developers go to great effort to hide malicious functionality in their apps. As a result, sometimes malicious apps are published on the App Store, despite Apple's extensive risk mitigation measures.

b) Many apps contain content that changes over time. Developers of fraudulent apps sometimes introduce a switching mechanism that makes the app appear benign (like a simple game) during initial review but contains a trigger that can be switched post-approval to serve illicit or fraudulent content (i.e. "bait-and-switch"). In 2022, Apple blocked or removed 23,823 apps for bait-and-switch tactics. In 2023, Apple removed or rejected 40,000 apps from developers who engaged in bait-and-switch activity.

c) An approved app may also be found to have misrepresented its privacy policies and to be illegally using personal information. An app might also evolve into a threat not inherent to its design. For example, a simple message board app that appears harmless on its face during App Review might later be used for illegal purposes.

Ongoing App Review, through automated scans and other threat detection tools, addresses the impact of a threat discovered post-approval. These tools help ensure that Apple can identify the developer, track malicious patterns by the same developer, identify similar patterns presented by other apps, and cut off distribution at a single source. Apple can directly communicate with the app developer and rapidly remove the app from the App Store if necessary. (A simplified sketch of post-approval change detection follows.)
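The post-approval change detection described above can be illustrated schematically. The fingerprint contents, the comparison rule, and the re-review trigger below are all hypothetical assumptions, not Apple's actual tooling:

```swift
// Illustrative sketch only: re-scanning a published app and comparing a
// current behavioural fingerprint with the one captured at review time, so a
// "bait-and-switch" change queues the app for human re-review.
import Foundation
import CryptoKit

struct BehaviourFingerprint {
    let contactedDomains: Set<String>   // hypothetical observed signals
    let declaredFeatures: Set<String>
}

func digest(_ f: BehaviourFingerprint) -> String {
    // Canonicalise the fingerprint so identical behaviour hashes identically.
    let canonical = (f.contactedDomains.sorted() + ["|"] + f.declaredFeatures.sorted())
        .joined(separator: ",")
    return SHA256.hash(data: Data(canonical.utf8))
        .map { String(format: "%02x", $0) }.joined()
}

/// True when a periodic re-scan observes behaviour that differs from what was
/// approved, signalling that the app should be re-reviewed by a human.
func needsReReview(approved: BehaviourFingerprint, observed: BehaviourFingerprint) -> Bool {
    digest(approved) != digest(observed)
}
```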
Automated and human-based app review

The App Review process applies both to new apps and to updates to existing apps (for example, when an app introduces a new version, adds new features, extends to new platforms, or uses an additional Apple technology).

Every app or app update provided to the App Store for distribution is uploaded through App Store Connect, which is a developer tool where developers upload, submit, and manage their apps. Upon submission, the developer creates an app record and provides app metadata, along with the app name and description and other relevant information.18 A complete set of metadata must be provided (i.e. if a submission includes "placeholder" text, it will be rejected). Every app or app update submission is then reviewed by the App Review team, first via automated means and then by human app reviewers.

18 https://developer.apple.com/support/terms/

Automated review

The App Review automated process includes static binary analysis, asset analysis, and runtime analysis via automated on-device install, launch, and exploration tests. The aim of these automated processes is to efficiently gather information that can be interpreted by machine learning algorithms and analysed for threats and signals (for example, the presence of malicious URLs or executable code) that provide relevant app information to the human review component. The automated review process also conducts checks [CONFIDENTIAL], and cross-references apps and developers against previously identified threats in the App Store ecosystem to better detect malicious actors, fraud, and other abuses.

For over a decade, using proprietary machine learning tools and technologies, the App Store has developed an internal corpus of information used to mitigate risks, such as previously identified threats, identified malicious apps and developers, suspicious keywords, and malicious IP addresses and URLs. For example, malicious URL detection involves analysing URLs that have been previously flagged for illegal or harmful content or characteristics. By analysing information in new app submissions for similarities with previously identified information, the automated review component of the App Review process helps keep bad apps and actors from entering or re-entering the App Store. (A schematic sketch of such corpus matching follows.)
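The corpus-matching step for malicious URLs might be sketched as follows, assuming a hypothetical placeholder corpus and a simple host or subdomain match:

```swift
// Illustrative sketch only: checking URLs found in a new submission against a
// corpus of previously flagged malicious hosts. The corpus contents and the
// normalisation rules are hypothetical.
import Foundation

let flaggedHosts: Set<String> = ["malicious.example", "phish.example"] // placeholder corpus

func suspiciousURLs(in submissionURLs: [String]) -> [String] {
    submissionURLs.filter { raw in
        guard let host = URL(string: raw)?.host?.lowercased() else { return false }
        // Match the host itself or any subdomain of a flagged host.
        return flaggedHosts.contains(where: { host == $0 || host.hasSuffix("." + $0) })
    }
}
```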
Similarly, automated review interprets cached text and images, [CONFIDENTIAL], and identifies potential threats like executable code, which could be used to change app features or functionality after app review and approval.

The information gathered during automated review flags potential risks and provides useful signals and information for human app reviewers to evaluate in more detail. [CONFIDENTIAL] Finally, as explained in more detail below, automated processes continue after approval of apps that are available on the App Store, with automated detection and escalation mechanisms continuing to scan for potential threats.

Automated review capabilities are continually assessed for their performance and improved. The App Review team works with engineering teams and domain experts across Apple to identify trends flagged by human app reviewers, investigate spikes in reports relating to specific issues (e.g. via Report a Problem), and assess novel threats and the applicability of both established and emerging technologies to mitigate those threats. Multiple improvement efforts have historically been introduced each year.

19 [CONFIDENTIAL]

There are more than 100,000 app submissions in an average week. In 2022, App Review reviewed 6,101,913 submissions (including app updates), of which over 25% were rejected by the App Review team for various compliance issues.20 In 2023, App Review reviewed 6,892,500 submissions (including app updates); again, over 25% were rejected.21 App Review therefore serves an important function in mitigating risks, including potential Systemic Risks, in the App Store.

20 https://www.apple.com/legal/more-resources/docs/2022-App-Store-Transparency-Report.pdf
21 https://www.apple.com/legal/more-resources/docs/2023-App-Store-Transparency-Report.pdf

Human review

Every app and every app update undergoes human review. During human review, app reviewers analyse the signals provided by automated systems and review the features and functionality of apps to ensure they are compatible with the App Store's systems and products, comply with the Guidelines, and do not give signs of potentially deceptive, abusive, or otherwise harmful behaviour. If a reviewer detects a potential Guideline violation, they engage with the developer, reject the app, or further escalate issues to specialists within the App Review team or to other functional groups, such as the App Store Legal team. If there are no Guideline violations, the app may be approved for publication in the App Store. [CONFIDENTIAL]

Human review builds on and complements automated review, since human app reviewers are often better positioned than automated tools to identify apps that risk physical harm, apps which are unreliable, or apps which otherwise pose concerns in ways that are not readily apparent to automated (static and dynamic) tools. As regards safeguarding user data and privacy, [CONFIDENTIAL] a human app reviewer is trained to assess [CONFIDENTIAL] are appropriate for the app's functionality. For example, a human app reviewer will likely decide that a calculator app does not need to request access to data and functionality like photos or the microphone. Similarly, app reviewers are trained to evaluate whether an app's age rating is appropriate given the app's content and functionality, as well as whether apps with user-generated content have sufficient content moderation mechanisms to protect children or mitigate risks related to offensive content, harmful concepts, or public security. (A toy illustration of a permission-plausibility signal follows.)
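A reviewer-support signal of the calculator-app kind could be sketched as follows; the category baselines below are invented for illustration and are not Apple's actual review criteria:

```swift
// Illustrative sketch only: flag apps whose requested permissions exceed what
// their category plausibly needs, as a signal for the human reviewer. The
// baselines are hypothetical.
enum Permission: String { case photos, microphone, camera, location, contacts }

let plausiblePermissions: [String: Set<Permission>] = [
    "Utilities/Calculator": [],                        // needs no sensitive data
    "Photo & Video":        [.photos, .camera, .microphone],
    "Navigation":           [.location]
]

func implausibleRequests(category: String, requested: Set<Permission>) -> Set<Permission> {
    requested.subtracting(plausiblePermissions[category] ?? [])
}

// A non-empty result is a signal for human review, not an automatic rejection.
let flags = implausibleRequests(category: "Utilities/Calculator", requested: [.microphone])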
The App Store review process is carried out by over 500 human app review experts, including over 170 individuals based in the EU, representing 81 languages across three time zones. Prior to reviewing any apps, new employees receive four to six weeks of intensive training regarding, inter alia, all components of the Guidelines, including screening for privacy and data issues, particularly for children; objectionable content; apps with user-generated content; and legal considerations.

The App Review teams are educated on potential legal issues and risks – including highly sensitive topics such as CSAM, real money gambling, illegal content, suppression of human rights, and misleading public health information – and the appropriate escalation paths. Apps are assigned to individuals for review based on their skills, qualifications and experience, including language capabilities, cultural sensitivities, and specialised training.

After initial training, new App Review personnel's work is monitored and audited, and they receive regular performance feedback and specialised training, as appropriate. All app reviewers have ongoing support and internal resources, such as mentoring, coaching, access to app review processes and policies, and weekly and ad hoc meetings with managers. The work of human reviewers is audited, and new and emerging issues feed into guidance updates and learning resources. The App Review team also monitors customer and developer feedback to assess performance. Additionally, the App Review Business Excellence team performs quality control and audits to conduct root-cause analysis and make necessary improvements, whether to tools or to the performance management of reviewers.

The diverse App Review team tracks evolving risks in the EU and around the world, based on trends, language cues, global events, and other signals, all of which are used to continually update and train the automated and human review functions. App reviewers are kept up to date regarding new and evolving risks via the coaching, access to practices and policies, and meetings referred to above.

When App Review discovers apps that contain illegal, fraudulent or malicious content or behaviour, it adjusts the review process to prevent such apps from being approved in the future. If Apple discovers apps that have not sought to circumvent the App Store review process per se but that are exhibiting malicious or user-unfriendly behaviours after installation, Apple similarly adjusts its processes to prevent this from reoccurring. If Apple discovers new malware on its platforms, it adjusts its custom-written malware scanners to scan apps already on the App Store and detect such malware in the future.

Post-publication review

The App Review process continues even after an app is first published on the App Store. Developers are required to submit updates to their apps to the App Review team. This ensures that Apple's App Review function reviews apps throughout their entire lifecycle, and can identify new features and functionality that may not comply with the Guidelines. Furthermore, the App Store takes action against apps that exhibit malicious or other problematic behaviours after they have become available in the App Store. The App Store has a number of automated tools in place to detect malware in existing apps, which it runs at periodic intervals to capture content at different times. This includes tools to identify "bait-and-switch" apps, where apps available on the App Store change or add new functionality after approval by the App Review team. Once flagged by automation, these apps are re-reviewed by human app reviewers to evaluate whether intervention is needed.
App Store User-Generated Content Measures

The only UGC on the App Store is user-generated app ratings and reviews, which are subject to content moderation by the Trust and Safety Operations team, who also moderate developers' responses to reviews. The Trust and Safety Operations team takes both preventative and responsive steps to mitigate risks arising from UGC, which include the publication of false, illegal or harmful content, or fraudulent conduct that is designed to manipulate an app's rating ("Rating and Review" fraud). Without ratings and reviews moderation, misleading and fraudulent information could spread on the App Store, which could lead users to download malicious apps.

A number of key process mitigations apply to user submission of ratings and reviews. In particular, ratings and reviews can only be submitted by registered users who have downloaded the relevant app. Furthermore, all user ratings and reviews are subject to a publication delay before being published on the App Store.

A number of monitoring processes are carried out to protect against fake or fraudulent reviews, including scanning for spam, profanity and foul language, and multiple duplicate or similar entries. Furthermore, Apple has a number of systemic block and monitoring processes to moderate user ratings and reviews and developer responses, and takes action against users and developers who do not comply with the applicable ratings and reviews terms and conditions. (A simplified sketch of pre-publication screening of this kind follows.)
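The pre-publication mitigations just described (verified downloads, a publication delay, and spam and duplicate screening) can be combined into a single illustrative check. The threshold, profanity list, and delay period below are hypothetical:

```swift
// Illustrative sketch only: pre-publication checks on a submitted rating or
// review. Values are placeholders, not Apple's actual parameters.
import Foundation

struct SubmittedReview {
    let accountID: String
    let hasDownloadedApp: Bool
    let text: String
    let submittedAt: Date
}

let publicationDelay: TimeInterval = 60 * 60      // placeholder: 1 hour
let profanity: Set<String> = ["badword"]          // placeholder list
var recentTexts: Set<String> = []                 // recently seen review bodies

func mayPublish(_ r: SubmittedReview, now: Date = Date()) -> Bool {
    guard r.hasDownloadedApp else { return false }                    // verified downloads only
    guard now.timeIntervalSince(r.submittedAt) >= publicationDelay else { return false }
    let words = Set(r.text.lowercased().split(separator: " ").map(String.init))
    guard words.isDisjoint(with: profanity) else { return false }     // profanity screen
    guard !recentTexts.contains(r.text) else { return false }         // duplicate screen
    recentTexts.insert(r.text)
    return true
}
```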
Reviews can be sorted by helpfulness, rating, or recency. When ordering reviews by helpfulness, Apple considers the review's source, quality, thoroughness, and timeliness, as well as how other customers have engaged with the review.

The Trust and Safety Operations team also reacts when it is alerted to potentially problematic ratings and reviews, or developer responses, via "Report a Concern". This functionality and the related process are described in further detail below.

The Trust and Safety Operations team works with a variety of partner teams, including AppleCare, to continually improve the automated processes that flag and block fake or fraudulent reviews prior to publication, and the post-publication review and escalation procedures.

When the App Store is alerted to a concern about a rating or review, it investigates and may remove a review or developer response, and / or disable the ability to review from a user account. In certain cases, ratings and reviews are escalated for further investigation, for example where a reported concern contains malicious activity that implies bodily harm, or child safety and / or child exploitation concerns. Reviews that contain information concerning a criminal offence involving a threat to life or safety will also be escalated and, if necessary, reported to law enforcement, in accordance with Article 18 of the DSA.

In 2022, the App Store processed over 1 billion ratings and reviews, of which more than 147 million were blocked or removed for failing to meet its moderation standards. In 2023, the App Store processed over 1.1 billion ratings and reviews; close to 152 million were removed.

Review and Controls Associated with Recommender Systems

As explained above, users can discover apps available in the App Store through five tabs: Today, Games, Apps, Arcade, and Search. The apps that are displayed in these tabs appear organically (for example, in various categories of "Top" charts) in all tabs except Search; as "recommendations", in the form of algorithmically selected or editorially curated recommendations, in all tabs; as a search result in the Search tab; or as an Apple Search Ad in the Today or Search tabs. App recommendations may also be personalised based on a user's demographic information, as well as App Store purchase and download history. Notably, all apps appearing in the App Store, including those which are recommended, have already undergone the rigour of the App Review process and have been approved for publication in the App Store. [CONFIDENTIAL]

Algorithmically Selected App Recommendations

Apple maintains an app repository that describes various attributes of apps during their lifecycle in the App Store. For example, the app repository includes standard app information and metadata supplied by the developer, such as the name of the app and developer, when the app was released, the app categories, and the app's age rating. It also includes information about the app's popularity, including statistics on app downloads and transactions; aggregate and anonymised user engagement signals, such as browse and search activity; and fraud trust signals. [CONFIDENTIAL]

Whether an app appears in recommendations depends on machine learning algorithms that interpret information from the app repository related to: (i) app quality; (ii) app popularity; (iii) app sensitivities; and (iv) the context of the recommendation. Not all apps may appear as recommendations. [CONFIDENTIAL] For example, if the App Store becomes aware of violations of the Guidelines, the app may be removed from recommendations until the app becomes compliant. [CONFIDENTIAL] (A schematic sketch of an eligibility-and-scoring decision of this kind follows.)
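Schematically, an eligibility-and-scoring decision over the four signal families named above could look like the following sketch, with hypothetical fields and weights:

```swift
// Illustrative sketch only: combining quality, popularity, sensitivity, and
// context into a recommendation decision. Fields, weights, and the eligibility
// rule are hypothetical placeholders.
struct RepositoryEntry {
    let qualityScore: Double        // e.g. derived from ratings and stability signals
    let popularityScore: Double     // e.g. derived from downloads and engagement
    let isSensitive: Bool           // e.g. age-restricted or under investigation
    let guidelineCompliant: Bool
}

/// Returns nil when the app is withheld from recommendations entirely.
func recommendationScore(_ app: RepositoryEntry, contextAffinity: Double) -> Double? {
    guard app.guidelineCompliant, !app.isSensitive else { return nil }
    return 0.5 * app.qualityScore + 0.3 * app.popularityScore + 0.2 * contextAffinity
}
```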
Editorially Curated App Recommendations

The App Store Editorial team uses apps from the app repository to curate its own unique app recommendations. Factors that App Store editors consider when making recommendations include: (i) user interface design: the usability, appeal, and overall quality of the app; (ii) user experience: the efficiency and functionality of the app; (iii) innovation: apps that solve a unique problem for customers; (iv) localisations: high quality and relevant; (v) accessibility: well-integrated features; (vi) App Store product page: compelling screenshots, app previews, and descriptions; and (vii) uniqueness.

For games, editors also consider: (i) gameplay and level of engagement; (ii) graphics and performance; (iii) audio; (iv) narrative and story depth; (v) ability to replay; and (vi) gameplay controls.

The Editorial team creates a curated catalogue of apps for each category used in the various tabs (for example, original stories, tips, how-to guides, interviews, App of the Day, Game of the Day, Now Trending, Collections, Our Favorites, Get Started). For each curated category, the Editorial team determines whether to pin certain categories in designated vertical positions of tabs. They can also choose to personalise categories, as described below. If a story has been personalised, the curated category will surface and order stories that are most relevant based on a user's purchase and download history. [CONFIDENTIAL] The curation guidelines have been distilled into best practices, which are publicly available to help developers understand what the App Store finds valuable in curation for users.22

22 https://developer.apple.com/app-store/discoverability/

App Store Search Results function

Within the Search tab, users can use the "search" function to search for games, apps and Stories. This search function is designed to help users find the apps they are looking for as efficiently as possible.

Users can search in one of the 40 languages available on the App Store. When a user starts typing a search word, they are presented with a number of suggested terms in a list, before they hit the "search" button to action the search. These suggested terms are selected by algorithm. The dominant factor that determines these suggested terms is prior aggregate user search behaviour in the storefront in which the user is searching. This user behaviour is tracked on an anonymised basis and not per individual user. If there are few prior searches similar to what a user has started typing, another algorithm will suggest terms based on app name-matching. (A schematic sketch of this typeahead behaviour appears at the end of this subsection.)

When a user taps "search", they are presented with search results. These search results are unique to the App Store storefront associated with the user's account. Search results are determined by an algorithm, which ranks results based on a number of factors, including:

a) text relevance (for example, using an accurate app title), relevant keywords / metadata, and the category of app a user has searched for (for example, games);
b) signals associated with aggregated user behaviour, including app searches and downloads, and the number and quality of ratings and reviews and app downloads in the storefront the user is searching in; and
c) date of launch in the App Store.

When an app is new and does not have significant numbers of searches or user signals associated with it, it is automatically boosted by the search results algorithm. Once the app has sufficient exposure in the search function, and the algorithm has collected sufficient signals regarding its popularity / quality, the boost is removed.
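The typeahead behaviour described above (aggregate query popularity with an app-name fallback) can be sketched as follows, with hypothetical data structures and thresholds:

```swift
// Illustrative sketch only: typeahead suggestions driven by aggregate,
// anonymised query counts per storefront, falling back to app-name matching
// when little prior search behaviour exists.
import Foundation

struct Storefront {
    // query -> number of times searched, aggregated across all users
    let aggregateQueryCounts: [String: Int]
    let appNames: [String]
}

func suggestions(forPrefix prefix: String, in store: Storefront, limit: Int = 8) -> [String] {
    let p = prefix.lowercased()
    // Primary signal: popular prior queries beginning with the typed prefix.
    let popular = store.aggregateQueryCounts
        .filter { $0.key.hasPrefix(p) }
        .sorted { $0.value > $1.value }
        .map { $0.key }
    if popular.count >= 3 { return Array(popular.prefix(limit)) }
    // Fallback: match against app names when prior searches are sparse.
    let nameMatches = store.appNames
        .filter { $0.lowercased().hasPrefix(p) }
        .sorted()
    return Array((popular + nameMatches).prefix(limit))
}
```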
In limited circumstances, Apple may manually override results by removing or adding a given app listing from the search results. For example, if a developer adds keywords to their listing attempting to rank in queries for which they are not relevant, Apple can remove their result for that search query.

Apple applies the same search algorithm, applying the same factors, to its own apps as it does to third-party apps.

Search results are not personalised. However, some personalisation of the presentation of the results may occur on-device, for example if a user searches for an app that they have already downloaded to their device. In such instances, the search results may include product information about the already downloaded app in a more condensed form.

Apple Search Ads

Apple Search Ads is a service by which developers can pay for promoted placements of their apps in the App Store. Within the App Store, Apple Search Ads appear in the Today tab, the Search tab and Search results, and in app product pages users access while browsing. These promoted app placements appear on the App Store itself and are distinct from and unrelated to the third-party advertisements that may be shown within an app, for which the developer, and not Apple, is responsible. Apple Search Ads only feature apps already available in the App Store in the subject country or region.

With Apple Search Ads, it is made clear to users that they are seeing a promoted app placement (as opposed to an editorial / organic placement) through clear and conspicuous visual cues intended to make a clear distinction between promoted app placements and organic content. All such promoted app placements include a prominent "Ad" mark, and may include border and background shading demarcations. Moreover, the "Ad" mark is interactive; when a user taps on it, they see an "About this Ad" sheet, which explains why they are seeing that particular app and what criteria, if any, were used to display the relevant app campaign. If a user clicks on the promoted app, they are taken to the app product page.

Apple Search Ads determines which apps get promoted placement via a bid auction mechanism: advertisers pay only what they are willing to pay in a competitive auction marketplace, based on their individual preferences, including bids for actions like taps or installs. (A deliberately simplified auction sketch follows.)
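As a deliberately simplified stand-in for the auction described above (the report does not specify the actual ranking or pricing rule, so the one below is a generic placeholder), a placement decision could be sketched as:

```swift
// Illustrative sketch only: a generic bid auction for a promoted placement.
// Highest willingness-to-pay wins here; this is not Apple's actual mechanism.
struct AdBid {
    let appID: String
    let maxCostPerTap: Double   // the most the advertiser is willing to pay
}

func runAuction(bids: [AdBid]) -> AdBid? {
    // Advertisers never pay more than their own stated bid.
    bids.max { $0.maxCostPerTap < $1.maxCostPerTap }
}
```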
All developers who promote their apps using Apple Search Ads must contractually commit that their promoted apps will comply with all applicable laws and regulations.

Apple takes several measures to address risk relating to Apple-delivered promoted app placement on the App Store. For example, in addition to the actions performed by the App Review team to review and approve apps for distribution on the App Store, the Apple Search Ads team additionally reviews promoted app placements for content, imagery, and promotion category classification. Apple Search Ads policies prohibit certain categories of apps from being promoted on the App Store – either altogether, in certain countries or regions, or in certain App Store placements.23 Moreover, some categories of apps that are not prohibited may still face promotion restrictions as managed by the Apple Search Ads team – for example, submitting proof of specific permits or licences to Apple as a prerequisite to advertising, including the promotion of apps, in certain countries or regions. Additionally, the Apple Search Ads team routinely monitors account and advertiser actions for signs of potential misconduct and handles complaints relating to Apple Search Ads advertising.

23 https://searchads.apple.com/policies/

Apple Search Ads is engineered to facilitate promoted app placements in a manner that ensures that the App Store does not know which promotional app has been surfaced to a user, or whether an identifiable user has viewed or clicked on it.

Apple creates "segments" to deliver personalised Apple Search Ads on the App Store. Segments are groups of people who share similar characteristics. Information about a user may be used to determine which segments they are assigned to, and thus, which Apple Search Ads they receive. To protect user privacy, personalised Apple Search Ads are delivered only if more than 5,000 people meet the targeting criteria selected by an advertiser.

Information used to assign a user to segments is strictly limited and includes account information (for example, name, address, age, gender) and records of downloads, purchases and subscriptions on the App Store. When selecting which Apple Search Ad to display from multiple ads for which a user is eligible, Apple may use some of this information, as well as App Store searches and browsing activity, to determine which ad is likely to be most relevant. This information is aggregated across users so that it does not identify any single user. (A minimal sketch of the segment-size safeguard follows.)
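The segment-size safeguard can be expressed directly. The types and flow below are hypothetical, while the 5,000-user threshold is the one stated above:

```swift
// Illustrative sketch only: enforce a minimum audience size before a
// personalised segment may be used for ad targeting.
struct Segment {
    let name: String
    let memberCount: Int
}

let minimumSegmentSize = 5_000

func eligibleForPersonalisedDelivery(_ segment: Segment) -> Bool {
    // Segments below the threshold are never used for targeting, so an ad
    // cannot be narrowed to a small, potentially identifiable group of users.
    segment.memberCount >= minimumSegmentSize
}
```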
Pursuant to its obligation under Article 39 of the DSA, Apple has created a public online repository of apps promoted as Apple Search Ads.24 The repository sets out information about each app presented as an Apple Search Ad to consumers within the EU, including what content was presented, where, and when. The repository is designed to contain this information for the period that the Apple Search Ad unit is live, and for one year from the date of its last impression. For content that is restricted due to alleged illegality, a governmental order, or incompatibility with applicable terms and conditions, the repository is designed to record the restriction as well as the grounds for the restriction. The repository is accessible and can be queried through a dedicated website. An API is also available for large-volume queries.

24 https://adrepository.apple.com/

Apple Search Ads is built with strong limitations to protect children and minors:

a) For a minor under 18 (or the age of majority in the relevant jurisdiction) who is logged in with their Apple ID, the Personalised Ads setting is automatically set to "off" and cannot be enabled until the user reaches the age of majority. With Personalised Ads set to off, Apple cannot use account information (for example, name, address, age, gender), app downloads, or in-app purchases and subscriptions for serving Apple Search Ads in the App Store.

b) When a user turns 18 (or the relevant age of majority), the App Store app will display a prompt to allow the user to choose whether or not to agree to receive personalised Apple Search Ads on the App Store.

Furthermore, as explained in Section 2 above, each app has an age rating. These age ratings, and the age of the user, determine whether, and if so, which Apple Search Ads will be displayed to users under 18 years of age, subject always to the following limitations:

a) Apple Search Ads are not presented to users under the age of 13;
b) Apps rated 17+ are not presented to users under 18 as Apple Search Ads; and
c) Certain categories of apps, irrespective of age rating, are not presented to users under 18 as Apple Search Ads.

For users over 18, it is the developer's responsibility to configure minimum age targeting to local law requirements. (The sketch below expresses the gating rules above as a single predicate.)
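Expressed as a single predicate, with hypothetical enum cases and category flags, the gating rules above read as follows:

```swift
// Illustrative sketch only: the age-gating rules summarised above as a
// predicate deciding whether a given ad may be shown to a user. The types and
// the restricted-category flag are hypothetical placeholders.
enum AgeRating { case fourPlus, ninePlus, twelvePlus, seventeenPlus }

struct AdCandidate {
    let ageRating: AgeRating
    let isRestrictedCategoryForMinors: Bool // categories never shown to under-18s
}

func mayShowAd(_ ad: AdCandidate, toUserAged age: Int) -> Bool {
    if age < 13 { return false }                              // no ads under 13
    if age < 18 {
        if ad.ageRating == .seventeenPlus { return false }    // 17+ apps never shown to minors
        if ad.isRestrictedCategoryForMinors { return false }  // restricted categories
    }
    return true
}
```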
Personalisation

Personalised recommendations are not available for minors, managed accounts, and accounts that have opted out of personalised recommendations.

For a child account, i.e. an account registered via Family Sharing for a user under 13 (or the minimum age of lawful consent in the relevant jurisdiction in application of Article 8 of the GDPR), the Apple ID is not eligible to receive any personalised recommendations in the App Store.

Users can change the Personalised Recommendations setting for their Apple ID by going to iOS Settings > [user name], tapping Media & Purchases, tapping View Account, and then toggling Personalised Recommendations on or off. Users can also learn more about which information is used to personalise the recommendations made to them (for example, information about purchases, downloads, and other activities in the App Store).

If Personalised Recommendations is turned on, user interactions within the App Store may be used to personalise app recommendations and editorial content. For example, the App Store Today tab will recommend content that may be of interest to the user based on what they have previously searched for, viewed, downloaded, updated, or reviewed in the App Store. Recommendations are also based on user purchase history, including in-app purchases, subscriptions, and payment methods, together with account information derived from the user's Apple ID.

In addition, personalised recommendations are based on aggregate information about app launches, installs, and deletions from users who choose to share device analytics with Apple, and aggregate information about app ratings.

If Personalised Recommendations is turned off, a user will not receive personalised recommendations or editorial content. Instead, recommendations from the app repository will display apps without reference to the user's engagement with the App Store.

Mitigating potential third-party abuses

The Trust and Safety Operations team is responsible for "live moderation" of App Store hosted UGC and for protecting App Store discovery features, including charts and search, from fraudulent behaviour, including the behaviour of "bots". Inauthentic ratings and reviews from fraudulent or bot accounts can mislead users into downloading an untrustworthy app that attempts to game the system through misrepresentation.

The Trust and Safety Operations team uses a number of automated monitoring tools to identify suspicious accounts, apps and app-related activity. These systems help detect suspicious charts and search manipulation. Trust and Safety Operations can take a range of steps to protect against suspicious charts and search manipulation, including suppressing an app from search for a limited period. They can also take action against developers who repeatedly manipulate App Store discovery features, up to and including termination of developer accounts.

The Trust and Safety Operations team evaluates the efficacy of the automated signals it receives regarding bot accounts and suspicious activity and drives conversations regarding possible improvements.

App Store and Privacy

App Store & Privacy Notice

When first interacting with the App Store, users are presented with service-specific privacy information, in the form of the App Store & Privacy Notice.25 This ensures that users have an effective choice and that any consent to data use on Apple products is fully informed.

25 https://www.apple.com/privacy/labels/

Also presented to users at this time is Apple's Data & Privacy Icon, which links to more detailed on-screen information and more detailed service-specific privacy information regarding the App Store's privacy practices. This provides users with transparent and easily accessible information that details how Apple collects, processes and discloses their personal data.

The App Store uses, inter alia, local, on-device processing to enhance its recommendations and mitigate privacy risks. In addition, using data such as app installs, the App Store can suggest apps and in-app events that are more relevant to users. These recommendation systems are described above.

The App Store & Privacy Notice also explains how users can turn off personalisation features. Personalisation is described in further detail above.

When a user uses a payment card in the App Store, Apple may obtain information from the financial institution or payment network, and may also use it for fraud prevention and verification.

Privacy Nutrition Labels

Product pages in the App Store feature a section that includes summaries, prepared by developers, of their key privacy practices in a simple, easy-to-read label, which informs the user about the app's privacy practices before downloading it. These labels show how developers are collecting and using user data, such as a user's location, browsing history, and contacts. The same applies to Apple's own apps.26 Privacy Nutrition Labels are an innovative and easily understandable feature which makes use of clear language and images / icons to explain how data is used.

26 https://support.apple.com/en-us/HT212958

App Privacy Report

The App Privacy Report, accessible via a user's Settings, records data on device and sensor access, app and website network activity, and the most frequently contacted domains, in an encrypted form on user devices.27 Via this report, users are able to see how often their location, photos, camera, microphone, and contacts have been accessed by apps during the last seven days, and which domains those apps have contacted. Users therefore have full and easy visibility into the ways apps use the privacy permissions a user has granted them, as well as their respective network activity.

27 https://www.apple.com/newsroom/2023/05/app-store-stopped-more-than-2-billion-in-fraudulent-transactions-in-2022/
Together with Privacy Nutrition Labels, this feature provides users with transparent information about how the apps made available on the App Store treat user privacy.

App Tracking Transparency Framework

If a developer wants to track a user across apps and websites or access their device's data for advertising purposes, they must seek the user's permission through the App Tracking Transparency framework. This applies across all apps available on the App Store. Tracking in this instance refers to linking user or device data collected from an app with user or device data collected from other companies' apps, websites, or offline properties for targeted advertising or advertising measurement purposes. Tracking also refers to sharing user or device data with data brokers.

An app tracking section in Settings lets users easily see which of their apps have been given permission to track, so they can change their preferences and disable apps from asking in the future. (A minimal sketch of the permission request follows.)
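The permission request itself uses Apple's App Tracking Transparency framework. The call below is the real API; when and how an app gates its tracking logic on the result is shown as a hypothetical flow:

```swift
// Illustrative sketch only: request tracking permission before any cross-app
// tracking, as described above. The framework call is real; the surrounding
// gating flow is a hypothetical example.
import AppTrackingTransparency

func requestTrackingConsent(thenEnableTracking enable: @escaping (Bool) -> Void) {
    ATTrackingManager.requestTrackingAuthorization { status in
        // Tracking (linking user or device data across companies' apps and
        // websites) is permitted only when the user explicitly authorises it.
        enable(status == .authorized)
    }
}
```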
Access Permissions and App Sandbox

Apps may request access to features such as a user's location, contacts, calendars, or photos. The App Sandbox protects user data by limiting access to resources requested through entitlements. Users receive a prompt with an explanation the first time an app wants to use this data, allowing them to make an informed decision about granting permission. Developers are required to obtain permission from users, by a simple, clearly understandable, and prominently placed means, before tracking them or tracking their devices across apps and websites owned by other companies for ad targeting, for ad measurement purposes, or to share data with data brokers. Even if a user grants access once, they can change their preferences in Settings at any time. In addition, no app can access the microphone or camera without the user's permission. When an app uses the microphone or camera, the user's device displays an indicator to let the user know it is being used – whether the user is in the app, in another app, or on the Home Screen. In addition, the Control Center on a user's device shows the user if an app has recently used the microphone or camera.

The App Sandbox provides protection to system resources and user data by limiting a developer's app's access to resources requested through entitlements. This creates secure silos that protect the data of end users across the device. (A sketch of the first-use permission prompt flow follows.)
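The first-use prompt flow can be illustrated with the camera. The AVFoundation request API below is real; the surrounding wrapper and flow are a hypothetical sketch (the explanation shown to the user comes from NSCameraUsageDescription in the app's Info.plist):

```swift
// Illustrative sketch only: ask before first use of the camera, and proceed
// only if the user grants access, consistent with the description above.
import Foundation
import AVFoundation

func startCameraIfPermitted(onReady: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        onReady()
    case .notDetermined:
        // The system shows the prompt (with the Info.plist purpose string)
        // the first time access is requested.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted { DispatchQueue.main.async { onReady() } }
        }
    default:
        break // denied or restricted: respect the user's choice
    }
}
```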
If they decline, the process stops there (i.e. the App Store will not complete the download or purchase).

Child Safety

Apple employs dedicated Child Safety Counsel. Child Safety Counsel works with other areas of the Apple business (including those specific to the App Store) relevant to child safety and contributes to policies and procedures to keep children safe when they engage with Apple products and services. Child Safety Counsel is also responsible for investigating escalations from within Apple and from third parties (including developers and users) relating to CSAM or CSEA material and, where necessary, for reporting issues to law enforcement agencies.

App Store External Notice and Action Measures

As detailed above, there are multiple proactive controls in the App Store designed to stop problematic apps from being published on the App Store. There are further controls in place to ensure that only a smaller subset of apps is recommended to users, whether as recommended or editorial content, or as Apple Search Ads.

In addition, there are various reactive controls in place, which are designed to ensure that users, developers, government agencies and others can alert the App Store to problematic apps that have already been published on the App Store.

Report a Problem

Customers may use the "Report a Problem" feature to submit notices of offensive, illegal, or abusive content concerning apps they have purchased or downloaded. The Report a Problem function is a tool to help users raise concerns to the App Review team and other teams about content they may encounter on the App Store. Consumer protection is a priority of the App Store, and an area of focus for the App Store Trust and Safety Operations team. "Report a Problem" is a cross-functional effort which originated from collaboration between Trust and Safety Operations team engineers and product managers and their counterparts in the App Review team and Worldwide Developer Relations, to create user- and developer-facing solutions to address common concerns in the App Store.

The Report a Problem link is displayed in the quick links at the bottom of the Games and Apps tabs, and on the product page of any app a user has purchased or downloaded. Users can choose from "report a scam or fraud" and "report offensive, abusive, or illegal content" options to submit their concern about content they have purchased or downloaded. Users are presented with a free text field to describe the issue they are reporting. [CONFIDENTIAL]

As detailed below, developers have recourse to various appeal mechanisms in the event that they disagree with Apple's decision to remove apps or terminate developer accounts.

Report a Concern

The Report a Concern tool is another key control, which allows users and developers to raise concerns regarding the content of specific user reviews and developer responses to such reviews. Concerns can be raised in relation to any content where reviews are available. Report a Concern is available to developers in App Store Connect, as well as to developers and users on the App Ratings and Review page, where users can press and hold on the review and Report a Concern will appear in the pop-up menu.
The Trust and Safety Operations team works with AppleCare to review external escalations raised via "Report a Concern".

Report a Concern could be used in the following scenarios:

a) Users or developers seeking to flag misleading, offensive, illegal or irrelevant content in reviews, or content that otherwise violates the Submission Guidelines of the AMS Terms. All such flagged reviews are subject to moderation.
b) Where a developer may have posted offensive, illegal, or misleading responses to critical reviews.
c) Developers are encouraged, in the event they see a review that contains offensive material, spam, or other content that violates the AMS Terms and Conditions, to use the Report a Concern option under the review in App Store Connect instead of responding to the review.

AppleCare reviews Report a Concern escalations and performs an initial triage for offensive content, including illegal content, instances of profanity, solicitation, or spam. Reported concerns go into a queue for the AppleCare team, which is trained by Trust and Safety Operations on identifying user review violations, actioning concerns, and escalating issues to other relevant teams as necessary. The AppleCare team receives guidance and training on how to consider a reported concern, including investigation, follow-up and escalation paths.

Following its consideration, AppleCare can leave the review as-is, remove a review or developer response, and/or disable the ability to review from a user account. If a reported concern contains a threat or reference to suicide, malicious activity that suggests bodily harm, child safety and/or child exploitation concerns, or otherwise indicates a safety issue, the AppleCare team is instructed to send an email to escalate the matter directly to Trust and Safety Operations. The Trust and Safety Operations team will then forward the review and its associated data, including reviewer ID and email address, to Apple's Global Security Investigations team for further action, which may include alerting law enforcement. Apple has updated its processes to reflect the requirements in Article 18 of the DSA.

AppleCare continuously monitors new trends among the customer concerns being reported and escalated. AppleCare partners with a variety of teams, including Trust and Safety Operations, to adapt ratings and reviews detection and response measures where appropriate.

Notices Routed to App Store Legal

The App Store Legal team is responsible for reviewing and vetting notices from external sources that involve issues with apps in the App Store. As noted above, government regulatory authorities send notices to the App Store, including requests for information about an app or developer, or demands to take down an app pursuant to local law or court order, via a dedicated email inbox. Likewise, local law enforcement authorities send notices and requests for information to a similar dedicated email inbox, as explained above. In addition, customers, developers, government authorities or other parties may provide notices to various functions throughout Apple, which are then routed to the App Store Legal team.

The App Store Legal team works with the App Review team, which reviews and investigates the app for any issues identified in the government notice.
If the App Review team identifies a Guideline violation, they will employ standard operating procedures to engage the developer and ensure the app is brought into compliance with the Guidelines, or remove the app and/or terminate the developer if the circumstances warrant it. If there is a valid legal basis or government order to remove the app, the App Review team will take appropriate action and may communicate the issue to the developer, as appropriate. This may include removing the app from the local storefront in question to comply with local law.

Content disputes

Rights holders can submit App Store content disputes via a dedicated webpage.30 These submissions are routed to the AMS Content Disputes Legal team for consideration.

Once the AMS Content Disputes Legal team receives a complete complaint, the team responds with a reference number.31 They put the complainant in direct contact with the provider of the disputed app. If needed, complainants can then correspond with the AMS Content Disputes Legal team directly via email. The parties to the dispute are primarily responsible for its resolution.

However, in certain cases, including where the parties are unable to resolve the dispute bilaterally, the AMS Content Disputes Legal team will intervene. The team does not take apps down solely on the basis of fraudulent or anti-competitive claims, but instead will consider a number of factors when deciding whether or not to remove potentially violative apps from the App Store. These include:

a) whether the app or developer has been the subject of other complaints;
b) the frequency of such complaints; and
c) whether there is reasonable indication that an intellectual property violation has occurred.

30 https://www.apple.com/legal/intellectual-property/dispute-forms/app-store/
31 In the event that a party abandons a claim, Apple has automated templates which are sent out as reminders, and if no response is received, the matter will be recorded as having been closed.

If there are continued violations by a developer, or the developer makes fraudulent misrepresentations of material facts, the AMS Content Disputes Legal team may have the developer's account terminated.

The AMS Content Disputes Legal team addresses and mitigates risks of potential intellectual property violations on the App Store, and prevents repeat offenders from accessing Apple's services and causing subsequent infringements. The AMS Content Disputes Legal team has implemented various controls and processes in order to do so.

Dedicated contact points for government authorities and agencies

Government authorities from law enforcement and various regulatory agencies may send notices requesting information or app removals based on alleged or suspected violations of local law. Authorities send requests to the App Store to take down or investigate apps via email notice to dedicated email addresses, [CONFIDENTIAL] or, for law enforcement inquiries and notices, lawenforcement@apple.com.
These requests are vetted by the App Store Legal team.

Where credible information is received from any source (for example, users, developers or law enforcement) that a developer is not acting in accordance with the Guidelines or local law, Apple will investigate and take appropriate action, which may include removal of the app from the App Store and removal of the developer from the Apple Developer Program.

In addition, if Apple is alerted to information on the App Store that gives rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place, as envisaged in Article 18 of the DSA, steps will be taken to notify the appropriate law enforcement authorities.

Content Reports portal for DSA

Apple has enhanced its escalation and reporting mechanisms to adequately capture reported concerns relating to Systemic Risks which may stem from the App Store or its use. In that regard, and in connection with its efforts to comply with Article 16(1) of the DSA, Apple enhanced its Report a Problem feature and created a new Content Reports portal to enable third parties in the EU to report illegal content.

In August 2023, the Report a Problem flow was updated to integrate with the new Content Reports portal. If a user on a storefront in the EU engages Report a Problem in the App Store, they can select "Report offensive or abusive content" or "Report illegal content" from the menu of options. If they select the former, the user goes through the process flow outlined above. If they select the latter, they are redirected to the Content Reports portal. The Content Reports portal can also be accessed directly via the web.

The Content Reports portal is a central platform where individuals, including government representatives and, in due course, "Trusted Flaggers", can file notices concerning alleged illegal content; it is also the platform from which communications concerning those notices are processed and sent, and in which data is consolidated for later transparency reporting purposes. Anyone in the EU can submit concerns about alleged illegal content via the Content Reports portal, whether or not they have purchased or downloaded the app in question. Members of the public in the EU can also use the portal to anonymously file notices concerning CSAM content.

[CONFIDENTIAL] All remaining notices will undergo manual triage before submission to App Review. Manual triage will help Apple track and understand the kinds of notices it receives, [CONFIDENTIAL] and help identify possible misuse and abuse of the system. Once a notice passes through these triage systems, an automatic acknowledgment communication will be sent to the notifier.

After undergoing a verification process intended to safeguard the system and prevent abuse, government representatives (and, in due course, trusted flaggers) can submit notices which bypass the triage systems and are processed on an expedited basis. Government representatives and trusted flaggers will also receive acknowledgment communications when their notice is submitted to App Review for analysis.

The App Review team collaborates with relevant internal teams and partners, including the App Store Legal team when appropriate, to review, analyse, and action the notices.
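For illustration only, the routing logic described above can be sketched as follows. Every name in this sketch (Notifier, ContentNotice, Route, route) is hypothetical and does not reflect Apple's actual systems; it simply restates the flow in code: notices from verified government representatives (and, in due course, trusted flaggers) bypass triage and are expedited, while all other notices undergo triage before submission to App Review, with an acknowledgment sent to the notifier in either case.

    // Hypothetical sketch of the notice-routing flow described above.
    // None of these types reflect Apple's actual implementation.
    enum Notifier {
        case publicUser           // any individual in the EU
        case verifiedGovernment   // verified government representative
        case trustedFlagger       // trusted flagger (in due course)
    }

    struct ContentNotice {
        let notifier: Notifier
        let description: String   // free-text account of the alleged illegal content
    }

    enum Route {
        case expeditedAppReview   // bypasses triage; processed on an expedited basis
        case manualTriage         // tracked and checked for misuse before App Review
    }

    func route(_ notice: ContentNotice) -> Route {
        // An automatic acknowledgment is sent to the notifier in both branches.
        switch notice.notifier {
        case .verifiedGovernment, .trustedFlagger:
            return .expeditedAppReview
        case .publicUser:
            return .manualTriage
        }
    }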
Once an action is taken, the Content Reports portal facilitates necessary communications to notifiers and designated appointees about the actions taken and, when necessary, to impacted consumers who purchased illegal products or services.

If a notifier disagrees with an outcome, they have the option to challenge the decision via https://contentreports.apple.com/Complaints. These complaints are received through a separate section of the Content Reports portal and are routed to senior App Review analysts for review. The senior App Review analyst reviews the original notice alongside any new information provided by the complainant. These senior App Review analysts partner with relevant internal teams, including the App Store Legal team where necessary, to evaluate the complaints. Some matters may be escalated for review by the ERB. Communications are sent to complainants as part of this process.

In order to meet the DSA transparency reporting obligations, data is collected throughout the various steps of the content reporting flow described above.

DSA Compliance function, website and transparency reporting

DSA Compliance function

In order to meet the requirements of the DSA, Apple established a DSA Compliance function within Apple's Compliance and Business Conduct Department.

The DSA Compliance function is functionally independent from Apple's operational functions. The Head of DSA Compliance reports directly to the ADI Board on matters relating to DSA compliance.

Pursuant to Article 41(2) of the DSA, the Head of DSA Compliance has ultimate responsibility for, inter alia:

a) cooperating with Coimisiún na Meán and the Commission for the purposes of the DSA;
b) ensuring that all risks referred to in Article 34 of the DSA are identified and properly reported on, and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35 of the DSA;
c) organising and supervising the activities of the independent audit that ADI will procure in accordance with Article 37 of the DSA;
d) informing and advising relevant Apple management and employees about relevant obligations under the DSA, including planned training on the DSA; and
e) monitoring Apple's compliance with its obligations under the DSA.

The Head of DSA Compliance is supported in this role on a day-to-day basis by a number of legal and other functions responsible for work relating to the App Store, including the App Store Legal team, EU Regulatory Legal, and Privacy Compliance.

DSA Information site

Apple has created a DSA information site - https://www.apple.com/legal/dsa/ie - which contains:

a) the contact details of the Head of DSA Compliance, as the designated point of contact under Articles 11 and 12 of the DSA for communications with Member State authorities, the European Commission, the European Board for Digital Services, and developers and users of the App Store;
b) a link to the Content Reports portal;
c) a link to the Ads Repository;
d) a link to the DSA redress page, which lists redress options for anyone who has filed an Article 16 Notice via the Content Reports portal and wants to challenge Apple's decision, as well as redress options for developers and users who want to challenge decisions Apple has taken.
The page will be updated in the future as Article 21 out-of-court dispute settlement bodies are established;
e) a link to the average monthly recipients report; and
f) links to the DSA Transparency Reports.

In due course, the Information Site will also include App Store Risk Assessment reports.

DSA Transparency Reports

Pursuant to Articles 15, 24, and 42 of the DSA, Apple publishes App Store DSA Transparency Reports every six months, containing information on orders and notices of illegal content which the App Store has received, and on content moderation measures which the App Store has taken on its own initiative.