[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"$f4OPJETscLV257MYj9aG3a0fMWiSMb851M_JH7cpq2uQ":3},{"tableOfContents":4,"markDownContent":5,"htmlContent":6,"metaTitle":7,"metaDescription":8,"wordCount":9,"readTime":10,"title":11,"nbDownloads":12,"excerpt":13,"lang":14,"url":15,"intro":8,"featured":16,"state":17,"author":18,"authorId":19,"datePublication":23,"dateCreation":24,"dateUpdate":25,"mainCategory":26,"categories":42,"metaDatas":48,"imageUrl":49,"imageThumbUrls":50,"id":58},true,"Tired of general newsletters that skim over your real concerns? **DastraNews** offers legal and regulatory monitoring **specifically designed for DPOs, lawyers, and privacy professionals**.\r\n\r\nEach month, we go beyond a simple recap: we select about ten decisions, news, or positions **that have a concrete impact on your missions and organizations**.\r\n\r\n🎯 **Targeted, useful monitoring grounded in the real-world realities of data protection and AI.**\r\n\r\nHere is our selection for **September 2025:**\r\n\r\n## EU consultation on AI Transparency Guidelines\r\n\r\nThe **European Commission** launched a public consultation (deadline: **9 October 2025**) [to develop **guidelines and a code of practice for transparent AI systems**](https://digital-strategy.ec.europa.eu/en/news/commission-launches-consultation-develop-guidelines-and-code-practice-transparent-ai-systems), drawing from the transparency provisions in the **AI Act**.\r\n\r\nUnder the AI Act, deployers and providers of **generative AI**, **emotion recognition**, **biometric categorization**, and manipulated content systems must disclose to users when they interact with AI, or when content is AI-generated or manipulated. The consultation invites AI developers, public bodies, research groups, civil society, and citizens to contribute views. 
The transparency obligations are set to apply starting **2 August 2026**.\r\n\r\n**Implications for practitioners:**\r\n\r\n- Organizations must prepare to embed transparency disclosures (AI labels, metadata, explanation) in their systems.\r\n\r\n- Participating in the consultation is a chance to influence how the obligations will be defined and enforced.\r\n\r\n- The upcoming code of practice may become a de facto standard or a benchmark in audits, litigation, and enforcement.\r\n\r\n---\r\n\r\n## EU–U.S. data flows: DPF validated by General Court\r\n\r\nOn [**3 September 2025**, the General Court of the European Union dismissed ](https://curia.europa.eu/jcms/upload/docs/application/pdf/2025-09/cp250106en.pdf)the annulment action brought by MP Philippe Latombe against the **EU–U.S. Data Privacy Framework (DPF)**. Key findings of the Court:\r\n\r\n- The Court chose to rule **on the merits**, bypassing debates about Latombe’s standing.\r\n\r\n- The decision validates that the DPF provides a level of protection **“essentially equivalent”** to EU standards, aligning with the principle in GDPR Article 45.\r\n\r\n- The **Data Protection Review Court (DPRC)**, as set up under the DPF, was judged sufficiently independent to satisfy redress requirements.\r\n\r\n- Regarding bulk collection by U.S. intelligence, the Court recognized that ex post review and procedural safeguards under **Executive Order 14086** are consistent with EU jurisprudence on surveillance (Schrems II).\r\n\r\n**What this means for data controllers and processors:**\r\n\r\n- The DPF remains a valid mechanism for EU → U.S. transfers under GDPR’s Chapter V.\r\n\r\n- Nevertheless, organizations should maintain fallback strategies: **SCCs/BCRs, detailed Transfer Impact Assessments (TIAs)**, especially for sensitive flows.\r\n\r\n- Monitor U.S. legal and oversight changes: the Court emphasized that the Commission must continuously evaluate whether U.S. 
practices remain aligned with the framework.\r\n\r\n- An appeal to the **CJEU** is possible.\r\n\r\n> 👉 For more information, read our article [here](https://www.dastra.eu/en/article/eu-us-data-flows-secured-by-court-ruling/59566).\r\n\r\n---\r\n\r\n## CJEU ruling: pseudonymized data is not always personal\r\n\r\nOn [**4 September 2025**, in case C-413/23 (EDPS v SRB), the Court of Justice of the European Union ](https://curia.europa.eu/juris/document/document.jsf?text=&docid=303863&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=17376403)clarified that **pseudonymized data may, in certain circumstances, be considered as non-personal** by recipients. Key clarifications from the judgment:\r\n\r\n- The Court emphasized that the **recipient’s perspective** matters: if the recipient cannot reasonably re-identify data subjects, pseudonymized data may fall outside the scope of personal data.\r\n\r\n- Identifiability must be assessed based on **realistic technical and organizational means**, not theoretical possibility.\r\n\r\n- The **controller's obligations** (transparency, information to data subjects at time of collection) still apply regardless of how data may later appear to recipients.\r\n\r\n- This marks a shift from the “absolute” view (supported by EDPS/EDPB) toward a **contextual, risk-based approach** to pseudonymization.\r\n\r\n**Practical takeaways for DPOs & legal teams:**\r\n\r\n- Separate the identifying information (keys) and limit access to reinforce non-identifiability.\r\n\r\n- When sharing or transferring data, perform a recipient-level assessment: what means do they have to re-identify?\r\n\r\n> 👉 For more information, read our article [here](https://www.dastra.eu/en/article/cjeu-are-pseudonymised-data-always-personal/59571).\r\n\r\n---\r\n\r\n## CNIL imposes major fines on Google & SHEIN for cookie violations\r\n\r\nOn[ **3 September 2025**, the CNIL announced significant sanctions: 
](https://www.cnil.fr/fr/regulation-des-cookies-la-cnil-poursuit-le-plan-daction-initie-en-2019-et-prononce-deux-amendes)**Google received €325 million**, **SHEIN €150 million** for breaches of e-privacy rules concerning cookies and trackers. These fines are part of CNIL’s long-term **action plan launched in 2019** to enforce stricter compliance. Key violations include:\r\n\r\n- Using trackers without valid prior consent (violating consent and information rules)\r\n\r\n- “Cookie walls” (requiring acceptance of trackers for access) deemed acceptable only if the user has a **real choice** and alternatives are balanced and equally accessible.\r\n\r\n- Google was also held in breach for sending advertising emails based on user data within Gmail without proper consent (violating the CPCE, art. L. 34-5)\r\n\r\n**Takeaways for operators & marketers:**\r\n\r\n- Cookie compliance: regulatory attention remains high.\r\n\r\n- Avoid dark patterns, forced consent, or opaque cookie walls.\r\n\r\n- Review your consent, logging, audit trail, and cookie walls carefully. Documentation is key if challenged.\r\n\r\n---\r\n\r\n## Brazil: adequacy decision in the works\r\n\r\nThe **European Commission** has released a[ **preliminary adequacy decision** recognizing that Brazil’s data protection framework (LGPD)](https://commission.europa.eu/document/download/f5aee532-70bf-41b1-a94a-8e294a528f6a_en?filename=Draft%20Adequacy%20Decision%20-%20Brazil%20-%20LGPD%20-%20FINAL%20-%20September%202025.pdf) ensures a level of protection equivalent to EU standards. Once finalized, this will allow **free and secure data flows** between the EU and Brazil without extra safeguards.\r\n\r\n[On the Brazilian side](https://www.gov.br/anpd/pt-br/assuntos/noticias/european-union-releases-preliminary-version-of-adequacy-decision), the **ANPD** is finalizing its own adequacy process to recognize EU law as equivalent. 
The mutual recognition will strengthen **citizens’ rights**, increase **legal certainty**, simplify **international business operations**, and boost **trade competitiveness**.\r\n\r\nThe process now moves to the **European Data Protection Board** for an opinion, followed by approval from EU member states. If adopted, Brazil will join 16 other jurisdictions (including the UK, Canada, Japan, and South Korea) already deemed adequate.\r\n\r\n## EDPB issues first guidelines on the interplay between the DSA & GDPR\r\n\r\nThe **European Data Protection Board (EDPB)** has published its [**first guidelines** clarifying the relationship between the **Digital Services Act (DSA)** and the **GDPR**](https://www.edpb.europa.eu/system/files/2025-09/edpb_guidelines_202503_interplay-dsa-gdpr_v1_en.pdf). The **DSA**, which governs online platforms and search engines, aims to create a safer digital environment and safeguard fundamental rights. Many of its obligations involve processing personal data, raising overlaps with the **GDPR**.\r\n\r\n### Key Takeaways from the Guidelines\r\n\r\n- **No hierarchy of laws**: The DSA imposes obligations on platforms (e.g., content moderation, diligence duties, algorithmic transparency), but these do not override or replace GDPR obligations. Data protection rules remain fully applicable.\r\n\r\n- **Legal basis & purposes**: Any data processing carried out under the DSA must still rely on a GDPR-compliant legal basis (such as consent or legitimate interest). 
The DSA does not create a new automatic ground for processing personal data.\r\n\r\n- **Shared responsibilities**: Roles between platforms, hosting providers, intermediaries, and other actors must be clearly defined to determine who is the data controller or processor in different scenarios (moderation, recommender systems, profiling, etc.).\r\n\r\n- **Transparency & information duties**: The DSA’s transparency requirements (explaining algorithms, moderation criteria, reporting obligations) must be coordinated with GDPR information rights (purpose, access, retention).\r\n\r\n- The EDPB highlights the need for **closer cooperation between** Digital Services Coordinators, the European Commission, and Data Protection Authorities to ensure legal certainty for companies and stronger protection of users’ rights.\r\n\r\n### Why it matters\r\n\r\nCompanies cannot assume compliance with one regime ensures compliance with the other. Instead, both must be reconciled through consistent governance, risk management, and user communication. This dual compliance challenge raises the stakes for platforms, which face oversight not only from data protection authorities but also from digital services regulators.\r\n\r\n## **CNIL and Inria strengthen partnership on data protection and algorithm evaluation**\r\n\r\nThe [**CNIL** (French Data Protection Authority) and **Inria** (French national institute for research in digital science) have signed a renewed cooperation agreement ](https://www.cnil.fr/fr/cnil-inria-partenariat-renforce-protection-donnees-evaluation-algorithmes?mkt_tok=MTM4LUVaTS0wNDIAAAGdSgArWzx1re5L4Z35JQYMHymtvmbjBFdfoCoMoFz03SAeWg4dGr2FhjHtyhTwo5thkKST71wmqSdryNG_y11cXwrWl0urlinZqgYfCSAy9yj4Fw)to deepen joint efforts in data protection, privacy and algorithmic evaluation. 
The partnership builds upon over ten years of collaboration, but now aligns more closely with the evolving regulatory and technological challenges in Europe, notably those posed by artificial intelligence.\r\n\r\nTogether, CNIL and Inria will coordinate research, produce shared tools and guidance, organize training and public outreach, and co-supervise doctoral and postdoctoral work. One focus will be the **new Institut national pour l’évaluation et la sécurité de l’intelligence artificielle (INESIA)**, with partners such as ANSSI and others.\r\n\r\n## **Commission proposes guidance and template for serious AI incidents and launches consultation**\r\n\r\nThe European Commission has published a[ **draft guidance document** and a **reporting template** ](https://digital-strategy.ec.europa.eu/en/consultations/ai-act-commission-issues-draft-guidance-and-reporting-template-serious-ai-incidents-and-seeks?)for serious incidents involving AI systems under the AI Act, and is seeking input through a public consultation.\r\n\r\nThe draft guidance clarifies when an event should be reported as a “serious incident,” what information must be included, and how the reporting process should operate. The template aims to harmonize the format and level of detail across Member States to ensure consistent incident handling and oversight.\r\n\r\nThe initiative underscores the Commission’s effort to give operational clarity to the AI Act’s obligations on incident reporting and monitoring. The consultation is open until a specified deadline.\r\n\r\n## Austrian court clarifies Article 22 GDPR in AMAS algorithm case\r\n\r\nThe[ Austrian Federal Administrative Court has overturned a ban on the AMAS algorithm](https://www.ris.bka.gv.at/Dokument.wxe?ResultFunctionToken=fbc4b935-fe77-4c28-8771-0079a0d41487&Position=1&SkipToDocumentPage=True&Abfrage=Bvwg&Entscheidungsart=Undefined&SucheNachRechtssatz=), used by Austria’s public employment service (AMS) to assess jobseekers’ labour market prospects. 
The court found a valid legal basis in the Labour Market Service Act, which specifies what data can be processed and for what purposes, thereby meeting GDPR Articles 6 and 9 requirements. Crucially, the judges held that AMAS did not amount to prohibited automated decision-making under Article 22 GDPR:\r\n\r\n- The court drew a clear distinction from the CJEU’s **SCHUFA** judgment (C-634/21). It acknowledged that AMAS carries out profiling and that its categorisation amounts to a form of “decision.”\r\n- However, Article 22 applies only where decisions are made *solely* by automated means and have legal or similarly significant effects.\r\n- In this case, AMS counsellors played a substantive role rather than a purely formal one. They were required to review and discuss the algorithmic output with jobseekers, take into account additional personal circumstances, correct the algorithm’s outcome where necessary, and ultimately make the final classification themselves.\r\n\r\n## The Data Act goes live\r\n\r\nOn **12 September 2025**, the [EU’s **Data Act (Regulation 2023/2854)**](https://eur-lex.europa.eu/eli/reg/2023/2854/oj/eng) transitions from a future framework into reality. While it had entered into force in January 2024, the regulatory obligations now become operational and enforceable.\r\n\r\nUnder the Data Act, any data generated by a connected product or a related service must be accessible by the user. Access must be timely, free of charge, in structured, machine-readable formats, and in real time when technically feasible.\r\n\r\nThe new regulation also embeds obligations on **contractual transparency**: sellers or lessors of connected products must inform users — prior to contract — of what data will be generated, how it will be stored, and when and how it can be accessed. 
In B2B and B2G contexts, data sharing must occur on **fair, reasonable, and non-discriminatory terms**.\r\n\r\nMoreover, the Data Act includes rules to promote competition via **cloud switching**: providers must remove technical and contractual barriers. From 12 January 2027, migration fees will no longer be allowed, and before then any fees must be limited to the provider’s internal costs.\r\n\r\n> 👉 For more information, read our article [here](https://www.dastra.eu/en/article/the-data-act-goes-live-what-now/59584).\r\n\r\n## CNIL fines La Samaritaine for hidden cameras in employee areas\r\n\r\nOn **18 September 2025**, [the **CNIL** imposed a **€100,000 fine** on **Samaritaine SAS** ](https://www.cnil.fr/fr/cameras-dissimulees-la-cnil-sanctionne-la-samaritaine?)for having hidden surveillance cameras in two storage rooms of its store. The cameras had the appearance of smoke detectors and recorded audio, effectively spying on staff. The devices were installed in August 2023, discovered by employees, then removed in September 2023.\r\n\r\n### Key findings and violations\r\n\r\n- Samaritaine failed to carry out any **prior analysis of GDPR compatibility** or document the exceptional nature of hidden cameras.\r\n\r\n- The cameras were not declared in the processing register or impact assessments, and the DPO was informed only after installation.\r\n\r\n- The audio recording was deemed excessive, breaching the principle of data minimization (Article 5(1)(c) GDPR).\r\n\r\n- The CNIL stressed that hidden camera use in the workplace is only permissible under **strict conditions**: temporary use, transparency when possible, prior justification, documented safeguards, and respect for employee privacy.\r\n\r\n### Lessons for businesses\r\n\r\nThis sanction underscores that even in difficult contexts (e.g., combating theft), surveillance must be tightly justified, documented, and proportionate. 
Hidden monitoring without due process or employee notice risks serious regulatory consequences.\r\n\r\n> 👉 For more information, read the deliberation [here](https://www.legifrance.gouv.fr/cnil/id/CNILTEXT000052266505).\r\n\r\n## Disney to pay $10M to settle FTC allegations over children’s data\r\n\r\nDisney has agreed to a[ **$10 million settlement** with the **U.S. Federal Trade Commission (FTC)** over claims that it enabled **unlawful collection of children’s personal data**. ](https://www.ftc.gov/news-events/news/press-releases/2025/09/disney-pay-10-million-settle-ftc-allegations-company-enabled-unlawful-collection-childrens-personal)The FTC alleged violations of the **Children’s Online Privacy Protection Act (COPPA)** regarding data collected in Disney’s Kids Mode mobile apps.\r\n\r\nAccording to the complaint, Disney collected persistent identifiers and other personal data from children without valid consent and used them for behavioral advertising. The settlement requires Disney to **delete the improperly collected data**, maintain a **compliance program**, and submit to **third-party privacy audits** for the next 20 years.","\u003Cp>Tired of general newsletters that skim over your real concerns? 
\u003Cstrong>DastraNews\u003C/strong> offers legal and regulatory monitoring \u003Cstrong>specifically designed for DPOs, lawyers, and privacy professionals\u003C/strong>.\u003C/p>\r\n\u003Cp>Each month, we go beyond a simple recap: we select about ten decisions, news, or positions \u003Cstrong>that have a concrete impact on your missions and organizations\u003C/strong>.\u003C/p>\r\n\u003Cp>🎯 \u003Cstrong>Targeted, useful monitoring grounded in the real-world realities of data protection and AI.\u003C/strong>\u003C/p>\r\n\u003Cp>Here is our selection for \u003Cstrong>September 2025:\u003C/strong>\u003C/p>\r\n\u003Ch2 id=\"eu-consultation-on-ai-transparency-guidelines\">EU consultation on AI Transparency Guidelines\u003C/h2>\r\n\u003Cp>The \u003Cstrong>European Commission\u003C/strong> launched a public consultation (deadline: \u003Cstrong>9 October 2025\u003C/strong>) \u003Ca href=\"https://digital-strategy.ec.europa.eu/en/news/commission-launches-consultation-develop-guidelines-and-code-practice-transparent-ai-systems\" rel=\"nofollow\">to develop \u003Cstrong>guidelines and a code of practice for transparent AI systems\u003C/strong>\u003C/a>, drawing from the transparency provisions in the \u003Cstrong>AI Act\u003C/strong>.\u003Cbr />\r\n\u003Cbr />\r\nUnder the AI Act, deployers and providers of \u003Cstrong>generative AI\u003C/strong>, \u003Cstrong>emotion recognition\u003C/strong>, \u003Cstrong>biometric categorization\u003C/strong>, and manipulated content systems must disclose to users when they interact with AI, or when content is AI-generated or manipulated. \u003Cbr />\r\n\u003Cbr />\r\nThe consultation invites AI developers, public bodies, research groups, civil society, and citizens to contribute views. 
The transparency obligations are set to apply starting \u003Cstrong>2 August 2026\u003C/strong>.\u003C/p>\r\n\u003Cp>\u003Cstrong>Implications for practitioners:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Organizations must prepare to embed transparency disclosures (AI labels, metadata, explanation) in their systems.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Participating in the consultation is a chance to influence how the obligations will be defined and enforced.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The upcoming code of practice may become a de facto standard or a benchmark in audits, litigation, and enforcement.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Chr />\r\n\u003Ch2 id=\"eu-u.s-data-flows-dpf-validated-by-general-court\">EU–U.S. data flows: DPF validated by General Court\u003C/h2>\r\n\u003Cp>On \u003Ca href=\"https://curia.europa.eu/jcms/upload/docs/application/pdf/2025-09/cp250106en.pdf\" rel=\"nofollow\">\u003Cstrong>3 September 2025\u003C/strong>, the General Court of the European Union dismissed \u003C/a>the annulment action brought by MP Philippe Latombe against the \u003Cstrong>EU–U.S. Data Privacy Framework (DPF)\u003C/strong>. \u003Cbr />\r\nKey findings of the Court:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>The Court chose to rule \u003Cstrong>on the merits\u003C/strong>, bypassing debates about Latombe’s standing.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The decision validates that the DPF provides a level of protection \u003Cstrong>“essentially equivalent”\u003C/strong> to EU standards, aligning with the principle in GDPR Article 45.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The \u003Cstrong>Data Protection Review Court (DPRC)\u003C/strong>, as set up under the DPF, was judged sufficiently independent to satisfy redress requirements.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Regarding bulk collection by U.S. 
intelligence, the Court recognized that ex post review and procedural safeguards under \u003Cstrong>Executive Order 14086\u003C/strong> are consistent with EU jurisprudence on surveillance (Schrems II).\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>\u003Cstrong>What this means for data controllers and processors:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>The DPF remains a valid mechanism for EU → U.S. transfers under GDPR’s Chapter V.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Nevertheless, organizations should maintain fallback strategies: \u003Cstrong>SCCs/BCRs, detailed Transfer Impact Assessments (TIAs)\u003C/strong>, especially for sensitive flows.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Monitor U.S. legal and oversight changes: the Court emphasized that the Commission must continuously evaluate whether U.S. practices remain aligned with the framework.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>An appeal to the \u003Cstrong>CJEU\u003C/strong> is possible.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>👉 For more information, read our article \u003Ca href=\"https://www.dastra.eu/en/article/eu-us-data-flows-secured-by-court-ruling/59566\">here\u003C/a>.\u003C/p>\r\n\u003C/blockquote>\r\n\u003Chr />\r\n\u003Ch2 id=\"cjeu-ruling-pseudonymized-data-is-not-always-personal\">CJEU ruling: pseudonymized data is not always personal\u003C/h2>\r\n\u003Cp>On \u003Ca href=\"https://curia.europa.eu/juris/document/document.jsf?text=&amp;docid=303863&amp;pageIndex=0&amp;doclang=EN&amp;mode=req&amp;dir=&amp;occ=first&amp;part=1&amp;cid=17376403\" rel=\"nofollow\">\u003Cstrong>4 September 2025\u003C/strong>, in case C-413/23 (EDPS v SRB), the Court of Justice of the European Union \u003C/a>clarified that \u003Cstrong>pseudonymized data may, in certain circumstances, be considered as non-personal\u003C/strong> by recipients. 
\u003Cbr />\r\n\u003Cbr />\r\nKey clarifications from the judgment:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>The Court emphasized that the \u003Cstrong>recipient’s perspective\u003C/strong> matters: if the recipient cannot reasonably re-identify data subjects, pseudonymized data may fall outside the scope of personal data.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Identifiability must be assessed based on \u003Cstrong>realistic technical and organizational means\u003C/strong>, not theoretical possibility.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The \u003Cstrong>controller's obligations\u003C/strong> (transparency, information to data subjects at time of collection) still apply regardless of how data may later appear to recipients.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>This marks a shift from the “absolute” view (supported by EDPS/EDPB) toward a \u003Cstrong>contextual, risk-based approach\u003C/strong> to pseudonymization.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>\u003Cstrong>Practical takeaways for DPOs &amp; legal teams:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Separate the identifying information (keys) and limit access to reinforce non-identifiability.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>When sharing or transferring data, perform a recipient-level assessment: what means do they have to re-identify?\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>👉 For more information, read our article \u003Ca href=\"https://www.dastra.eu/en/article/cjeu-are-pseudonymised-data-always-personal/59571\">here\u003C/a>.\u003C/p>\r\n\u003C/blockquote>\r\n\u003Chr />\r\n\u003Ch2 id=\"cnil-imposes-major-fines-on-google-shein-for-cookie-violations\">CNIL imposes major fines on Google &amp; SHEIN for cookie violations\u003C/h2>\r\n\u003Cp>On\u003Ca href=\"https://www.cnil.fr/fr/regulation-des-cookies-la-cnil-poursuit-le-plan-daction-initie-en-2019-et-prononce-deux-amendes\" rel=\"nofollow\"> \u003Cstrong>3 September 
2025\u003C/strong>, the CNIL announced significant sanctions: \u003C/a>\u003Cstrong>Google received €325 million\u003C/strong>, \u003Cstrong>SHEIN €150 million\u003C/strong> for breaches of e-privacy rules concerning cookies and trackers. These fines are part of CNIL’s long-term \u003Cstrong>action plan launched in 2019\u003C/strong> to enforce stricter compliance. \u003Cbr />\r\n\u003Cbr />\r\nKey violations include:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Using trackers without valid prior consent (violating consent and information rules)\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>“Cookie walls” (requiring acceptance of trackers for access) deemed acceptable only if the user has a \u003Cstrong>real choice\u003C/strong> and alternatives are balanced and equally accessible.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Google was also held in breach for sending advertising emails based on user data within Gmail without proper consent (violating the CPCE, art. L. 34-5)\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>\u003Cstrong>Takeaways for operators &amp; marketers:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Cookie compliance: regulatory attention remains high.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Avoid dark patterns, forced consent, or opaque cookie walls.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Review your consent, logging, audit trail, and cookie walls carefully. 
Documentation is key if challenged.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Chr />\r\n\u003Ch2 id=\"brazil-adequacy-decision-in-the-works\">Brazil: adequacy decision in the works\u003C/h2>\r\n\u003Cp>The \u003Cstrong>European Commission\u003C/strong> has released a\u003Ca href=\"https://commission.europa.eu/document/download/f5aee532-70bf-41b1-a94a-8e294a528f6a_en?filename=Draft%20Adequacy%20Decision%20-%20Brazil%20-%20LGPD%20-%20FINAL%20-%20September%202025.pdf\" rel=\"nofollow\"> \u003Cstrong>preliminary adequacy decision\u003C/strong> recognizing that Brazil’s data protection framework (LGPD)\u003C/a> ensures a level of protection equivalent to EU standards. Once finalized, this will allow \u003Cstrong>free and secure data flows\u003C/strong> between the EU and Brazil without extra safeguards.\u003C/p>\r\n\u003Cp>\u003Ca href=\"https://www.gov.br/anpd/pt-br/assuntos/noticias/european-union-releases-preliminary-version-of-adequacy-decision\" rel=\"nofollow\">On the Brazilian side\u003C/a>, the \u003Cstrong>ANPD\u003C/strong> is finalizing its own adequacy process to recognize EU law as equivalent. The mutual recognition will strengthen \u003Cstrong>citizens’ rights\u003C/strong>, increase \u003Cstrong>legal certainty\u003C/strong>, simplify \u003Cstrong>international business operations\u003C/strong>, and boost \u003Cstrong>trade competitiveness\u003C/strong>.\u003C/p>\r\n\u003Cp>The process now moves to the \u003Cstrong>European Data Protection Board\u003C/strong> for an opinion, followed by approval from EU member states. 
If adopted, Brazil will join 16 other jurisdictions (including the UK, Canada, Japan, and South Korea) already deemed adequate.\u003C/p>\r\n\u003Ch2 id=\"edpb-issues-first-guidelines-on-the-interplay-between-the-dsa-gdpr\">EDPB issues first guidelines on the interplay between the DSA &amp; GDPR\u003C/h2>\r\n\u003Cp>The \u003Cstrong>European Data Protection Board (EDPB)\u003C/strong> has published its \u003Ca href=\"https://www.edpb.europa.eu/system/files/2025-09/edpb_guidelines_202503_interplay-dsa-gdpr_v1_en.pdf\" rel=\"nofollow\">\u003Cstrong>first guidelines\u003C/strong> clarifying the relationship between the \u003Cstrong>Digital Services Act (DSA)\u003C/strong> and the \u003Cstrong>GDPR\u003C/strong>\u003C/a>. The \u003Cstrong>DSA\u003C/strong>, which governs online platforms and search engines, aims to create a safer digital environment and safeguard fundamental rights. Many of its obligations involve processing personal data, raising overlaps with the \u003Cstrong>GDPR\u003C/strong>.\u003C/p>\r\n\u003Ch3 id=\"key-takeaways-from-the-guidelines\">Key Takeaways from the Guidelines\u003C/h3>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>\u003Cstrong>No hierarchy of laws\u003C/strong>: The DSA imposes obligations on platforms (e.g., content moderation, diligence duties, algorithmic transparency), but these do not override or replace GDPR obligations. Data protection rules remain fully applicable.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Legal basis &amp; purposes\u003C/strong>: Any data processing carried out under the DSA must still rely on a GDPR-compliant legal basis (such as consent or legitimate interest). 
The DSA does not create a new automatic ground for processing personal data.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Shared responsibilities\u003C/strong>: Roles between platforms, hosting providers, intermediaries, and other actors must be clearly defined to determine who is the data controller or processor in different scenarios (moderation, recommender systems, profiling, etc.).\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Transparency &amp; information duties\u003C/strong>: The DSA’s transparency requirements (explaining algorithms, moderation criteria, reporting obligations) must be coordinated with GDPR information rights (purpose, access, retention).\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The EDPB highlights the need for \u003Cstrong>closer cooperation between\u003C/strong> Digital Services Coordinators, the European Commission, and Data Protection Authorities to ensure legal certainty for companies and stronger protection of users’ rights.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Ch3 id=\"why-it-matters\">Why it matters\u003C/h3>\r\n\u003Cp>Companies cannot assume compliance with one regime ensures compliance with the other. Instead, both must be reconciled through consistent governance, risk management, and user communication. 
This dual compliance challenge raises the stakes for platforms, which face oversight not only from data protection authorities but also from digital services regulators.\u003C/p>\r\n\u003Ch2 id=\"cnil-and-inria-strengthen-partnership-on-data-protection-and-algorithm-evaluation\">\u003Cstrong>CNIL and Inria strengthen partnership on data protection and algorithm evaluation\u003C/strong>\u003C/h2>\r\n\u003Cp>The \u003Ca href=\"https://www.cnil.fr/fr/cnil-inria-partenariat-renforce-protection-donnees-evaluation-algorithmes?mkt_tok=MTM4LUVaTS0wNDIAAAGdSgArWzx1re5L4Z35JQYMHymtvmbjBFdfoCoMoFz03SAeWg4dGr2FhjHtyhTwo5thkKST71wmqSdryNG_y11cXwrWl0urlinZqgYfCSAy9yj4Fw\" rel=\"nofollow\">\u003Cstrong>CNIL\u003C/strong> (French Data Protection Authority) and \u003Cstrong>Inria\u003C/strong> (French national institute for research in digital science) have signed a renewed cooperation agreement \u003C/a>to deepen joint efforts in data protection, privacy and algorithmic evaluation. \u003Cbr />\r\n\u003Cbr />\r\nThe partnership builds upon over ten years of collaboration, but now aligns more closely with the evolving regulatory and technological challenges in Europe, notably those posed by artificial intelligence.\u003C/p>\r\n\u003Cp>Together, CNIL and Inria will coordinate research, produce shared tools and guidance, organize training and public outreach, and co-supervise doctoral and postdoctoral work. 
\u003Cbr />\r\n\u003Cbr />\r\nOne focus will be the new \u003Cstrong>Institut national pour l’évaluation et la sécurité de l’intelligence artificielle (INESIA)\u003C/strong>, France’s national institute for AI evaluation and security, alongside partners such as ANSSI.\u003C/p>\r\n\u003Ch2 id=\"commission-proposes-guidance-and-template-for-serious-ai-incidents-and-launches-consultation\">\u003Cstrong>Commission proposes guidance and template for serious AI incidents and launches consultation\u003C/strong>\u003C/h2>\r\n\u003Cp>The European Commission has published a\u003Ca href=\"https://digital-strategy.ec.europa.eu/en/consultations/ai-act-commission-issues-draft-guidance-and-reporting-template-serious-ai-incidents-and-seeks?\" rel=\"nofollow\"> \u003Cstrong>draft guidance document\u003C/strong> and a \u003Cstrong>reporting template\u003C/strong> \u003C/a>for serious incidents involving AI systems under the AI Act, and is seeking input through a public consultation.\u003Cbr />\r\n\u003Cbr />\r\nThe draft guidance clarifies when an event should be reported as a “serious incident,” what information must be included, and how the reporting process should operate. The template aims to harmonize the format and level of detail across Member States to ensure consistent incident handling and oversight.\u003Cbr />\r\n\u003Cbr />\r\nThe initiative underscores the Commission’s effort to give operational clarity to the AI Act’s obligations on incident reporting and monitoring. 
The consultation is open for feedback until the deadline indicated in the Commission’s notice.\u003C/p>\r\n\u003Ch2 id=\"austrian-court-clarifies-article-22-gdpr-in-amas-algorithm-case\">Austrian court clarifies Article 22 GDPR in AMAS algorithm case\u003C/h2>\r\n\u003Cp>The\u003Ca href=\"https://www.ris.bka.gv.at/Dokument.wxe?ResultFunctionToken=fbc4b935-fe77-4c28-8771-0079a0d41487&amp;Position=1&amp;SkipToDocumentPage=True&amp;Abfrage=Bvwg&amp;Entscheidungsart=Undefined&amp;SucheNachRechtssatz=\" rel=\"nofollow\"> Austrian Federal Administrative Court has overturned a ban on the AMAS algorithm\u003C/a>, used by Austria’s public employment service (AMS) to assess jobseekers’ labour market prospects. \u003Cbr />\r\n\u003Cbr />\r\nThe court found a valid legal basis in the Labour Market Service Act, which specifies what data can be processed and for what purposes, thereby meeting the requirements of GDPR Articles 6 and 9. \u003Cbr />\r\n\u003Cbr />\r\nCrucially, the judges held that AMAS did not amount to prohibited automated decision-making under Article 22 GDPR:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>The court drew a clear distinction from the CJEU’s \u003Cstrong>SCHUFA\u003C/strong> judgment (C-634/21). It acknowledged that AMAS carries out profiling and that its categorisation amounts to a form of “decision.”\u003C/li>\r\n\u003Cli>However, Article 22 applies only where decisions are made \u003Cem>solely\u003C/em> by automated means and have legal or similarly significant effects.\u003C/li>\r\n\u003Cli>In this case, AMS counsellors played a substantive role rather than a purely formal one. 
They were required to review and discuss the algorithmic output with jobseekers, take into account additional personal circumstances, correct the algorithm’s outcome where necessary, and ultimately make the final classification themselves.\u003C/li>\r\n\u003C/ul>\r\n\u003Ch2 id=\"the-data-act-goes-live\">The Data Act goes live\u003C/h2>\r\n\u003Cp>On \u003Cstrong>12 September 2025\u003C/strong>, the \u003Ca href=\"https://eur-lex.europa.eu/eli/reg/2023/2854/oj/eng\" rel=\"nofollow\">EU’s \u003Cstrong>Data Act (Regulation 2023/2854)\u003C/strong>\u003C/a> transitions from a future framework into reality. While it had entered into force in January 2024, the regulatory obligations now become operational and enforceable.\u003C/p>\r\n\u003Cp>Under the Data Act, any data generated by a connected product or a related service must be accessible by the user. Access must be timely, free of charge, in structured, machine-readable formats, and in real time when technically feasible.\u003C/p>\r\n\u003Cp>The new regulation also embeds obligations on \u003Cstrong>contractual transparency\u003C/strong>: sellers or lessors of connected products must inform users — prior to contract — of what data will be generated, how it will be stored, and when and how it can be accessed. In B2B and B2G contexts, data sharing must occur on \u003Cstrong>fair, reasonable, and non-discriminatory terms\u003C/strong>.\u003C/p>\r\n\u003Cp>Moreover, the Data Act includes rules to promote competition via \u003Cstrong>cloud switching\u003C/strong>: providers must remove technical and contractual barriers. 
From 12 January 2027, migration fees will no longer be allowed, and before then any fees must be limited to the provider’s internal costs.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>👉 For more information, read our article \u003Ca href=\"https://www.dastra.eu/en/article/the-data-act-goes-live-what-now/59584\">here\u003C/a>.\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"cnil-fines-la-samaritaine-for-hidden-cameras-in-employee-areas\">CNIL fines La Samaritaine for hidden cameras in employee areas\u003C/h2>\r\n\u003Cp>On \u003Cstrong>18 September 2025\u003C/strong>, \u003Ca href=\"https://www.cnil.fr/fr/cameras-dissimulees-la-cnil-sanctionne-la-samaritaine?\" rel=\"nofollow\">the \u003Cstrong>CNIL\u003C/strong> imposed a \u003Cstrong>€100,000 fine\u003C/strong> on \u003Cstrong>Samaritaine SAS\u003C/strong> \u003C/a>for having hidden surveillance cameras in two storage rooms of its store. The cameras were disguised as smoke detectors and recorded audio, effectively spying on staff. The devices were installed in August 2023, discovered by employees, then removed in September 2023.\u003C/p>\r\n\u003Ch3 id=\"key-findings-and-violations\">Key findings and violations\u003C/h3>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Samaritaine failed to carry out any \u003Cstrong>prior analysis of GDPR compatibility\u003C/strong> or document the exceptional nature of hidden cameras.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The cameras were not declared in the processing register or impact assessments, and the DPO was informed only after installation.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The audio recording was deemed excessive, breaching the principle of data minimization (Article 5(1)(c) GDPR).\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The CNIL stressed that hidden camera use in the workplace is only permissible under \u003Cstrong>strict conditions\u003C/strong>: temporary use, transparency when possible, prior justification, documented safeguards, and respect for employee 
privacy.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Ch3 id=\"lessons-for-businesses\">Lessons for businesses\u003C/h3>\r\n\u003Cp>This sanction underscores that even in difficult contexts (e.g., combating theft), surveillance must be tightly justified, documented, and proportionate. Hidden monitoring without due process or employee notice risks serious regulatory consequences.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>👉 For more information, read the deliberation \u003Ca href=\"https://www.legifrance.gouv.fr/cnil/id/CNILTEXT000052266505\" rel=\"nofollow\">here\u003C/a>.\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"disney-to-pay-10m-to-settle-ftc-allegations-over-childrens-data\">Disney to pay $10M to settle FTC allegations over children’s data\u003C/h2>\r\n\u003Cp>Disney has agreed to a\u003Ca href=\"https://www.ftc.gov/news-events/news/press-releases/2025/09/disney-pay-10-million-settle-ftc-allegations-company-enabled-unlawful-collection-childrens-personal\" rel=\"nofollow\"> \u003Cstrong>$10 million settlement\u003C/strong> with the \u003Cstrong>U.S. Federal Trade Commission (FTC)\u003C/strong> over claims that it enabled \u003Cstrong>unlawful collection of children’s personal data\u003C/strong>. \u003C/a>\u003Cbr />\r\n\u003Cbr />\r\nThe FTC alleged violations of the \u003Cstrong>Children’s Online Privacy Protection Act (COPPA)\u003C/strong> regarding data collected in Disney’s Kids Mode mobile apps.\u003Cbr />\r\n\u003Cbr />\r\nAccording to the complaint, Disney collected persistent identifiers and other personal data from children without valid consent and used them for behavioral advertising. The settlement requires Disney to \u003Cstrong>delete the improperly collected data\u003C/strong>, maintain a \u003Cstrong>compliance program\u003C/strong>, and submit to \u003Cstrong>third-party privacy audits\u003C/strong> for the next 20 years.\u003C/p>\r\n","DastraNews: what happened in Privacy & AI in September? 
","Privacy & AI insights from the Dastra hub: actionable updates for pros who work daily in the field.",2302,13,"DastraNews: what happened in Privacy & AI in September? ",0,null,"en","dastranews-what-happened-in-september",false,"Published",{"id":19,"displayName":20,"avatarUrl":21,"bio":13,"blogUrl":13,"color":13,"userId":19,"creationDate":22},20352,"Leïla Sayssa","https://static.dastra.eu/tenant-3/avatar/20352/TDYeY3C8Rz1lLE/dpo-avatar-h01-150.png","2025-03-03T11:08:22","2025-10-06T11:15:15.724","2025-10-06T13:15:15.2913252","2025-12-09T09:11:00.0246024",{"id":27,"name":28,"description":29,"url":30,"color":31,"parentId":13,"count":13,"imageUrl":13,"parent":13,"order":12,"translations":32},2,"Blog","A list of curated articles provided by the community","blog","#28449a",[33,36,39],{"lang":34,"name":28,"description":35},"fr","Une liste d'articles rédigés par la communauté",{"lang":37,"name":28,"description":38},"es","Una lista de artículos escritos por la comunidad",{"lang":40,"name":28,"description":41},"de","Eine Liste von Artikeln, die von der Community verfasst 
wurden",[43],{"id":27,"name":28,"description":29,"url":30,"color":31,"parentId":13,"count":13,"imageUrl":13,"parent":13,"order":12,"translations":44},[45,46,47],{"lang":34,"name":28,"description":35},{"lang":37,"name":28,"description":38},{"lang":40,"name":28,"description":41},[],"https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-original.jpg",[51,52,53,54,55,56,57],"https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-1000.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-1500.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-800.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-600.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-300.webp","https://static.dastra.eu/content/74f891e7-4519-4945-b34f-e5afe6ecb5b1/dastractu-100.webp",59633]