Responsible use of AI-based tools in medical diagnosis and treatment – the AI sandbox
In 2021, the Norwegian Data Protection Authority (DPA) established a regulatory test environment (“sandbox”) for Artificial Intelligence (AI). The purpose of the sandbox is to let companies and government agencies collaborate with the DPA when developing and testing AI-based technology that is likely to have privacy implications. The DPA provides guidance during the development and testing phases to ensure that the resulting AI tools are responsible and comply with the rules on processing of personal data.

The Bergen hospital project
In a recently completed sandbox project, the DPA collaborated with the public hospitals in Bergen on the development of an AI-based diagnostic tool. The Bergen hospitals had found that 10 % of their patients account for more than half (53 %) of the total number of bed days, and that these patients are frequently readmitted to the hospital within 30 days of their previous discharge. The hospitals wanted to develop technology which, based on an algorithm and machine learning, could predict which of their patients were likely to be readmitted, based on an analysis of data from the individual patient’s medical records.
Earlier medical studies had shown that it was possible, based on routine data, to predict which patients had the highest risk of readmission. Such earlier models had proven to be accurate on a group level, but not on an individual level. The purpose of the project in Bergen was to develop technology that would apply such predictions to the individual patient, thereby enabling medical professionals to provide additional treatment to the patients who needed it in order to avoid readmission. This would in turn improve the individual health care received by these patients, as well as save hospital resources.
In step 1, Helse Bergen developed an AI-based system which would analyze data obtained from the patient’s digital medical records and provide a warning if there was an increased risk of readmission. This warning would be presented to the medical staff upon admission of the patient. The algorithm was developed using actual patient data.
In step 2, the medical personnel involved in the treatment of the patient would use the prediction together with their medical expertise to issue a patient score and implement preemptive measures for those patients where this was deemed necessary.
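As a purely illustrative aid, the following is a minimal sketch of such a two-step workflow: a classifier trained on features derived from medical records produces a readmission-risk warning, which clinicians then weigh alongside their own judgment. The feature names, threshold and data are invented for the example and are assumptions, not details of the Helse Bergen model.

```python
# Minimal sketch of the two-step workflow described above, using synthetic data
# and hypothetical feature names. Not the actual Helse Bergen model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Stand-in for features extracted from digital medical records.
X = np.column_stack([
    rng.poisson(2, n),           # num_prior_admissions (hypothetical)
    rng.poisson(5, n),           # num_prior_diagnoses (hypothetical)
    rng.integers(18, 95, n),     # age
    rng.integers(1, 365, n),     # days_since_last_discharge
])
# Synthetic label: readmitted within 30 days of discharge (yes/no).
logits = 0.5 * X[:, 0] + 0.2 * X[:, 1] - 0.01 * X[:, 3] - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: train a simple classifier and flag high-risk patients on admission.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]
warning = risk > 0.5  # illustrative threshold shown to staff as a warning flag

# Step 2 is human: clinicians combine the warning with their own assessment
# before deciding on any preemptive measures (no automated decision).
print(f"Patients flagged as high risk: {warning.sum()} of {len(warning)}")
```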
The DPIA prior to the project
Prior to the project, the participants carried out a Data Protection Impact Assessment (DPIA). The DPIA revealed two specific risks to the fairness of the processing: a risk of false negatives and false positives, and a risk of demographic bias. Neither was found to entail a high risk.
A false positive would simply mean that the patient receives more extensive treatment than they otherwise would have, which poses no threat to the patient. A false negative would mean that the patient receives the same treatment as they would have received without the tool, which likewise cannot be categorized as a high risk.
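As an illustration of how such false positive and false negative rates might be quantified when assessing this kind of risk, the sketch below computes a confusion matrix for a hypothetical risk-prediction tool; the labels and predictions are synthetic and do not reflect any project figures.

```python
# Hedged illustration: quantifying false positives and false negatives for a
# hypothetical readmission-risk tool. All numbers here are synthetic.
import numpy as np
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
y_true = rng.random(1000) < 0.2                 # actual 30-day readmissions
y_pred = y_true ^ (rng.random(1000) < 0.1)      # imperfect predictions

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
# False positive: extra attention for a patient who would not have been
# readmitted. False negative: the patient receives standard care, i.e. the
# same treatment as without the tool.
print(f"false positives: {fp}, false negatives: {fn}")
print(f"sensitivity: {tp / (tp + fn):.2f}, specificity: {tn / (tn + fp):.2f}")
```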
Regarding the risk of demographic bias, the project considered whether this risk could be reduced by using more extensive data sets. In machine learning it is often beneficial to provide as much data as possible; however, this was likely to conflict with the principle of data minimization in the GDPR. Training the algorithm on low-quality data sets was identified as a source of demographic bias, which could carry over into the finished product. The project found, however, that the accuracy of the algorithm did not depend on using an extensive amount of data about the patient; fewer, carefully selected parameters provided results that were just as accurate. For example, it turned out to be sufficient to consider the number of previous diagnoses given to a patient, without including data on each specific diagnosis.
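The trade-off described here can be illustrated with a small, hypothetical feature-selection experiment: a model given one indicator per specific diagnosis is compared with a model given only the number of previous diagnoses. The data and effect sizes are synthetic and chosen for illustration only; in this constructed setting the aggregated feature performs on par with the full set of indicators, mirroring the project’s finding that data minimization need not cost accuracy.

```python
# Illustrative sketch of the data-minimization point: one indicator per
# specific diagnosis code versus a single count of previous diagnoses.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n, n_diagnosis_codes = 4000, 50

# One column per diagnosis code (extensive data set) ...
diagnoses = rng.random((n, n_diagnosis_codes)) < 0.05
# ... versus a single aggregated feature (minimized data set).
diagnosis_count = diagnoses.sum(axis=1, keepdims=True)

# Synthetic outcome driven mainly by the overall disease burden.
y = rng.random(n) < 1 / (1 + np.exp(-(0.8 * diagnosis_count[:, 0] - 2.5)))

full = cross_val_score(LogisticRegression(max_iter=1000), diagnoses, y, cv=5)
minimal = cross_val_score(LogisticRegression(max_iter=1000), diagnosis_count, y, cv=5)
print(f"all diagnosis codes : {full.mean():.3f}")
print(f"diagnosis count only: {minimal.mean():.3f}")
```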
Lawfulness of the processing
The development of the tool was based on actual patient records, i.e., medical data relating to identified patients. Such data constitutes personal data and falls within the “special categories” of personal data under Article 9 GDPR.
In assessing the legal basis for the processing, the project distinguished between the processing of personal data for the purpose of developing the technology through machine learning and the subsequent use of the tool in medical treatment. For the development phase, the DPA found that the processing was lawful under Article 6(1)(e), as it was necessary for the performance of a task carried out in the public interest. The DPA further found that the processing was necessary for the purposes of preventive medicine under Article 9(2)(h), and for reasons of public interest in the area of public health under Article 9(2)(i). The DPA found a basis in Norwegian law in the legislation on health personnel, which, following an amendment in 2021, provides that, subject to approval from the Ministry of Health, patient record data may be used for the development of clinical tools and for the purpose of promoting health or improving medical services.
For the subsequent use of the tool, the DPA found that the processing was lawful under Article 6(1)(c), as the processing is necessary for compliance with a legal obligation: the hospitals are obliged to provide adequate medical services to their patients. The DPA further found that the processing was necessary for the purposes of preventive medicine and the provision of health care or treatment under Article 9(2)(h). The DPA found a basis in the Norwegian legislation on specialist medical services, which obliges the specialist health service to provide adequate medical services, and in the health personnel legislation, which obliges medical personnel to keep records of patients’ medical data.
Does use of the tool entail automated individual decision-making?
The project also considered whether use of the tool would violate the prohibition in Article 22 against automated individual decision-making. The conclusion was that it would not, as decisions regarding the patient are not based solely on automated processing but are made by doctors and other medical personnel based on input from the system.
Article provided by INPLP member: Øystein Flagstad (Gjessing Reimers, Norway)
Discover more about the INPLP and the INPLP-Members
Dr. Tobias Höllwarth (Managing Director INPLP)