News
The destruction of the algorithm: the new sanction for breaching the GDPR?
Since 2019, the US Federal Trade Commission (FTC) has been seeking ways to punish new digital unfair practices that consist of illegally obtaining personal information from Internet users and exploiting it with artificial intelligence tools. In such cases, the punishment consists of what is called "the destruction of the algorithm".

Since the resurgence of Artificial Intelligence, one of the main debates has been about the ethics of the algorithm. Indeed, many advocate a future based on human principles and values, which, by extension, they propose to apply to the behavior of robots. But few propose effective sanctions for companies that unlawfully use the personal data with which they train their algorithms.
An example of this is the US Federal Trade Commission (FTC), which since 2019 has been seeking ways to punish this new type of digital unfair practice, consisting of illegally obtaining personal information from Internet users in order to exploit it with artificial intelligence tools.
One of these measures consists of prohibiting the use of the result of the illicit activity, which has been called "the destruction of the algorithm". Through this measure, the regulator orders the cessation of the use of the algorithm, as the technological asset resulting from the unlawful activity, since that is where the economic value lies.
This sanction substitutes or complements the requirement to delete the data obtained, which by itself is not effective: even if the illegally obtained information is deleted, the algorithm has already learned from it and the desired objective has been achieved. This is known as the "algorithmic shadow". If, on the other hand, the sanction involves eliminating the results, it can, a priori, have an effective deterrent impact on companies that do not apply a sufficient degree of diligence in the use of this technology.
As an example, on March 4, 2022, the FTC reached a settlement in the enforcement proceedings initiated against the well-known company Weight Watchers over the operation of its Kurbo application, an app that offered advice on healthy eating. Due to the lack of sufficient control measures, the application allowed minors to register (information from children as young as 8 was detected), who thereby provided personal information without their parents' consent and, therefore, unlawfully. The American regulator's resolution forced the company to delete the illegally obtained data and, in addition to imposing a fine of 1.5 million dollars, ordered it to destroy the algorithms or other artificial intelligence models applied to this activity or obtained thanks to it.
The FTC had already innovated in 2019 when intervening in the Cambridge Analytica scandal, after it was shown that the company did not sufficiently protect the privacy of users. In that case, the agency forced the company to delete all the information it had illegally collected from Facebook users, including the algorithms used for that purpose and the results obtained through that practice.
Shortly thereafter, the FTC was confronted with another case, this time involving the company Everalbum, owner of a photo-sharing application that was accused of using facial recognition without users' authorization and without offering them the possibility to object.
In this case, the Commission forced Everalbum to delete all photographs, videos and biometric data obtained through its application, and to remove "any models or algorithms developed, in whole or in part" using such data.
By way of "algorithmic justice," the premise of the FTC's approach seems clear: do not allow companies that violate data protection rules to profit from the illicit use of the personal information they obtain, whether directly through its exploitation or through its use to create or train their algorithms. In short, the objective is to send a message to companies considering breaking the law: think twice, because it does not pay.
This situation raises an interesting legal debate, since in the data economy in which we live, algorithms are fundamental processing tools that are protected, among other rules, by those on trade secrets and intellectual property. However, it is no less true that the use of algorithms must not infringe data protection regulations or violate privacy laws.
Article provided by INPLP member: Francisco Perez Bes and Esmeralda Saracíbar (ECIX Group, Spain)
Discover more about the INPLP and the INPLP-Members
Dr. Tobias Höllwarth (Managing Director INPLP)