National AI legislation adopted in Italy: a first look at privacy implications
Italy has become the first EU Member State to adopt a comprehensive national AI regulatory framework with Law No. 132 of 23 September 2025. The law adapts the Italian legal system to EU Regulation 2024/1689 (AI Act), introducing provisions that overlap with existing data protection regulation, particularly affecting healthcare data processing, worker information duties and transparency obligations in public administration.
1. Introduction: the anthropocentric approach
Law 132/2025 establishes guiding principles that must inspire the entire national regulatory framework, in line with the European AI Act, including transparency, proportionality, robustness, accuracy, non-discrimination, protection of personal data and fundamental rights, sustainability and human responsibility.
It is a framework law: it does not yet contain detailed technical rules, but entrusts the Government with the task of adopting a series of implementing decrees. At its heart is an expressly anthropocentric approach: AI is treated as a tool to support humans, never as a substitute for them.
Beyond its statement of principles, the law contains provisions that apply across various economic sectors and overlap with the regulation currently in force, particularly as regards personal data protection.
2. Sector-specific rules
Law No. 132/2025 adopts a structured regulatory approach across several sectors in which AI can deliver tangible benefits to citizens and public institutions.
2.1 Healthcare and research
Within the regulatory framework outlined, AI is valued as a tool to support diagnosis, prevention and scientific research, without replacing the decision-making role of the doctor, who remains central to the treatment process.
In this context, Article 8 qualifies as a matter of significant public interest the processing of personal data, including the special categories of data referred to in Article 9 of the GDPR, carried out by public and private entities, including IRCCS (Scientific Institutes for Research, Hospitalisation and Healthcare) and private entities participating in research projects. Such processing is aimed at developing AI systems for diagnosis and treatment, drug development and rehabilitation technologies, in compliance with European data protection guarantees.
The secondary use of such data, stripped of direct identifying elements, is expressly authorised without further consent from the data subject, even where such consent was initially required by law. The obligation to provide adequate information, even in general form (e.g. via the organisation's website), remains unaffected, except where knowledge of the data subject's identity is unavoidable or necessary for the protection of health.
2.2 Labour
Article 11 states that AI must be used to improve workers' conditions, protect their physical and mental integrity and enhance the quality of their performance, never for the purpose of control or restriction of rights. The use of AI must be safe, reliable and transparent, without undermining human dignity or violating the confidentiality of personal data.
Employers are required to inform workers in advance about the use of AI systems, in accordance with the procedures set out in Article 1-bis of Legislative Decree 152/1997, entitled "Additional information requirements in the case of the use of automated decision-making or monitoring systems".
The Italian Data Protection Authority expressed critical views on this provision, observing that it does not expressly recall the protections provided by the GDPR (in particular Articles 22(3) and 88) and the Privacy Code (Articles 113 and 114). The Authority also considered the reference to Article 1-bis of Legislative Decree No. 152/1997 problematic, as it refers "only to fully automated processing".
2.3 Public administration and justice
The use of AI systems in public administration is subject to compliance with the principles of transparency, knowability of operation and traceability of use, for the protection of data subjects, as well as to the adoption of adequate technical, organisational and training measures aimed at ensuring responsible and informed use of the technologies.
In the judicial sphere, Article 15 sets out a key principle: even in the presence of AI systems, every decision remains the prerogative of the magistrate, who retains exclusive competence for the interpretation and application of the law, the assessment of facts and evidence, and the adoption of measures. This decision-making reserve clearly delimits the scope of AI use, ensuring that algorithmic tools do not affect the essential prerogatives of the judicial function or the independence of the judiciary.
3. Criminal safeguards against AI abuse
Article 26 introduces new types of offences and specific aggravating circumstances related to the use of AI systems. The provision transforms artificial intelligence into a relevant element for the purposes of criminal liability.
Article 61 of the Italian Criminal Code is supplemented with a new general aggravating circumstance: the use of artificial intelligence systems becomes an aggravating circumstance when it constitutes an insidious means, or hinders public or private defence, or when it aggravates the consequences of the offence.
The new Article 612-quater of the Italian Criminal Code defines the offence of unlawful dissemination of content generated or altered using artificial intelligence systems, targeting the non-consensual dissemination of falsified images, videos or voices, capable of misleading as to their authenticity, when such conduct results in unjust damage to the offended person. This is a direct response to the growing spread of deepfakes, with enhanced protection in cases involving vulnerable individuals or public authorities.
Overall, Article 26 sends an unequivocal message to companies, developers and users of artificial intelligence systems: technological innovation does not create a zone exempt from criminal law. The use of AI thus becomes a genuine legal risk factor, requiring the adoption of adequate compliance, technological governance and algorithm-control measures.
4. Investments for innovation and competitiveness
To provide concrete support for the adoption of AI, the measure also activates a €1 billion investment programme for start-ups and SMEs operating in the fields of artificial intelligence, cyber security and emerging technologies, strengthening the technological development of strategic supply chains with a high social impact.
This investment strategy aims to position Italy as a competitive player in the European AI landscape whilst ensuring that innovation develops within the anthropocentric and rights-protective framework established by the law.
5. Conclusion
Law No. 132/2025 presents numerous points of interest and peculiarities, enough to fuel the already substantial doctrinal debate that typically greets provisions which, as in this case, may fairly be called historic. A complete evaluation will have to await the implementing acts and, above all, the test of practical experience.
For privacy law practitioners, the legislation creates a complex, multi-layered compliance environment in which it will not always be easy to identify which guarantees apply in a given case. For managers, entrepreneurs and industry professionals, the challenge will be to turn constraints and risks into competitive opportunities: those who can integrate ethics, compliance and innovation, anticipating the rules, will gain a strategic advantage.
Article provided by INPLP member: Chiara Agostini (RP Legal & Tax, Italy)
Discover more about the INPLP and the INPLP-Members
Dr. Tobias Höllwarth (Managing Director INPLP)