By Rob van den Hoven van Genderen
On 5 February 2020, the Court of The Hague ruled that the SyRI (System Risk Indication) legislation is contrary to the European Convention on Human Rights (ECHR).1 The case was brought by a large number of civil society organizations against the State of the Netherlands’ use of SyRI to detect and combat fraud in a number of ‘risk areas’ with the help of data linking and analysis using algorithms. The court ruled that there was insufficient balance between the use of new technologies – such as AI, data analysis, algorithms, deep learning or self-learning systems – and respect for private life as set out in article 8 of the ECHR. According to the court, there is also a risk of discrimination. The law is insufficiently transparent and verifiable and therefore unlawful.
Tijmen Wisman of the Civil Protection Platform says about the verdict: “We have been proved right on all major fundamental points today. This is a timely victory for the legal protection of citizens in the Netherlands.”
The verdict seems revolutionary for the protection of privacy, but it rests on two competing considerations. On the one hand, the state has an interest in tackling fraud, and new technology such as algorithms may be used for that purpose; on the other hand, there must be a balance between the breach of privacy and the result. According to the Court, that balance was absent here and, moreover, the system encouraged discrimination.
The Court states that the member states – in this case the Netherlands – have a special responsibility with regard to new technologies, but does not exclude the use of this type of technology.
The Court weighed the objectives of the SyRI legislation, namely to prevent and combat fraud in the interest of economic well-being, against the interference with private life that the legislation entails. According to the court, the legislation does not meet the ‘fair balance’ that the ECHR requires for a breach of privacy to be sufficiently justified. With regard to the deployment of SyRI, the legislation is insufficiently clear and verifiable. The legislation is unlawful because it violates higher law (the European Convention on Human Rights) and is therefore non-binding.
The data that may be examined on the basis of this legislation are particularly sensitive. Among the categories listed, some striking examples stand out: in addition to the familiar identity data, there are education, integration and reintegration (bias!), compliance with regulations (social credit!), insurance, licenses and permits, but also water and energy use and other life data.
Pursuant to article 8 of the ECHR, the Netherlands as a member state has a special responsibility for the application of new technologies. This involves striking the right balance between, on the one hand, the benefits associated with the use of those technologies and, on the other, the interference that such use may cause to the right to respect for private life. This also applies to the deployment of SyRI.
It is not just the law as such that is flawed, but also the ‘black box’ aspects of the system:
- The risk model, the indicators and the data used are not public and not known;
- What counts as ‘objectively verifiable information’, among other things, is not explained;
- No information is provided about the operation of the risk model, for example about the type of algorithms used, nor about the method of risk analysis;
- It cannot be verified how the result of the analysis is produced and what steps it consists of;
- The transparency needed for verifiability is lacking, and the analysis therefore involves the risk of discriminatory effects in the decision making.
Interestingly, the Autoriteit Persoonsgegevens (hereinafter: AP), the Dutch Data Protection Authority, has been designated as the supervisory authority in the Netherlands within the meaning of the GDPR, and as external privacy regulator it supervises, among other things, compliance with the SyRI legislation – yet it did not intervene at an earlier stage.
Yet as early as 2014, in its advice on the implementation of SyRI, the AP already stated ‘select before you collect’: no more personal data may be collected than is actually necessary, and personal data must be handled as sparingly as possible. It is not proportional to collect more personal data than is necessary to achieve the purpose of the data processing. The collection of personal data must therefore be kept to a minimum. To do justice to the principle of select before you collect in the context of SyRI, the following two requirements are important:
1. The indicators must be objectifiable. Facts, circumstances and/or statistical data must make it plausible that an indicator points to the (possible) presence of fraud or misuse of public funds. In other words: an indicator must be properly substantiated.
2. The principle of select before you collect in the context of a SyRI project means that the persons whose data are provided are designated in advance.2
“The verdict seems revolutionary for the protection of privacy, but it rests on two competing considerations.”
Furthermore, the regulator states that whoever processes personal data is under an obligation to avoid or limit the infringement of others’ privacy within reasonable limits. One should refrain from processing personal data if the same purpose can also be achieved by other, less intrusive means.
Although not stated in so many words, this can also be understood to mean that intrusive technology cannot simply be deployed automatically and without in-depth consideration, and that doing so would be contrary to the subsidiarity principle.
Incidentally, it is striking that no involvement of the AP can be established prior to this court case.
There will probably be more cases like this in the near future, because government bodies – especially in justice, tax and (social) security – are making increasing use of intrusive technologies such as facial recognition technology (FRT) and algorithmic data analysis. This also concerns publicly available data and governmental databases.
In response to a request for information about which FRT and algorithms are used by the Ministry of Security and Justice, I was told that the question was too specialized for a direct answer and that I would be informed once the specialist within the Ministry had been contacted. I am still waiting…
About the author
Robert van den Hoven van Genderen is professor of AI Robotlaw at the University of Lapland, director of the Centre for Law and Internet at the Law Faculty of the Vrije Universiteit and president of the Netherlands Association for AI Robotlaw. Before his academic positions he worked, among other things, as director of Regulatory Affairs in the telecommunications industry.
1 Rb. Den Haag 5 februari 2020, ECLI:NL:RBDHA:2020:865.
2 Conceptbesluit SyRI, Briefadvies aan de Minister van Sociale Zaken en Werkgelegenheid, Den Haag 8 februari 2014.