Executive Summary:

The eKasa case concerns proceedings initiated by a group of 33 deputies of the Slovak Parliament, who claimed that the mandatory use of electronic cash registers – which remit data in real time to the central data warehouse of the Slovak tax administration for further processing by tax risk-scoring algorithms – and the collection of the unique identifier of the buyer (i.e. the end-customer) by the seller are contrary to Art. 8 ECHR (right to private life), as well as Art. 7 (right to private life), Art. 8 (right to data protection) and Art. 52(1) of the EU Charter (principle of legality), and their corresponding rights in the Slovak constitution. In summary, the claimants argued that the ‘eKasa’ system obliges sellers to remit, in real time, considerable amounts of data on both sellers and buyers, at least some of which seems irrelevant for tax purposes. Moreover, the data remitted is further processed by external risk-management (risk-scoring) algorithms in ways that are not provided for in the law, hence without democratic assent and in contravention of the principle of legality, as such risk-scoring algorithms have been implemented solely on the basis of managerial decisions and without sufficient safeguards against the misuse of data. The Court sided with the claimants and ruled that the legislature, when adopting or extending the use of machine-learning algorithms and automated decision-making systems, must ensure that such systems have a normative basis in the law and contain specific safeguards.

Primary issues of the case:

As documented in Slovakia’s country report, the Slovak tax administration makes use of three different tax algorithms: a chatbot, ‘TAXANA’, which is unrelated to this case; an ERP system, ‘eKasa’, which transfers real-time data to a risk-scoring machine-learning algorithm, ‘AIS-R’; and AIS-R itself, which, among other data, processes the data transferred by the eKasa system to predict the risk of VAT fraud associated with individual legal entities.
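To make the data flow concrete, the sketch below shows, in schematic Python, how receipt records remitted by a cash-register system might be aggregated into features and scored for fraud risk. This is a purely illustrative sketch: the actual AIS-R model is not public, and every field name, feature and scoring rule here is a hypothetical stand-in.

```python
# Purely illustrative sketch of an ERP-to-risk-scoring pipeline.
# Nothing here reflects the actual (non-public) AIS-R implementation;
# all field names, features and the scoring rule are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Receipt:
    seller_vat_id: str            # unique identifier of the seller
    buyer_id: Optional[str]       # unique identifier of the buyer (the contested field)
    amount: float                 # gross amount of the transaction
    vat_amount: float             # VAT charged on the transaction
    timestamp: str                # time the receipt was issued

def risk_features(receipts: list) -> dict:
    """Aggregate a seller's receipts into features a risk model might use."""
    total = sum(r.amount for r in receipts)
    vat = sum(r.vat_amount for r in receipts)
    return {
        "n_receipts": float(len(receipts)),
        "avg_amount": total / len(receipts) if receipts else 0.0,
        # An implausibly low effective VAT rate could flag under-reporting.
        "effective_vat_rate": vat / total if total else 0.0,
    }

def risk_score(features: dict) -> float:
    """Toy linear scoring rule standing in for a machine-learning model."""
    expected_rate = 0.20  # hypothetical standard VAT rate
    deviation = abs(features["effective_vat_rate"] - expected_rate)
    return min(1.0, deviation * 5)  # 0 = low risk, 1 = high risk

demo = [Receipt("SK-111", "B-001", 100.0, 5.0, "2021-12-01T10:00:00")]
print(risk_score(risk_features(demo)))  # 0.75 -- VAT low relative to turnover
```

The legal point the sketch makes visible is that remittance (the eKasa stage) and downstream scoring (the AIS-R stage) are distinct processing operations, and, as discussed below, only the first was provided for in the law.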

The first issue in this case is that, while the eKasa system is regulated by law, the machine-learning algorithm which processes that data further down the chain is not provided for in the law, but was implemented based on a managerial decision of the Slovak tax administration. As shown in our country reports, Slovakia is far from being the only Member State in such a situation. Many EU Member States either do not have any specific legislative norms regulating the use of their machine-learning risk-scoring algorithms, or have laws that are neither sufficiently precise nor contain sufficient safeguards to protect taxpayers against misuse of their data or abuse of power. This phenomenon of lack of legality, lack of transparency and lack of verifiable insights afforded to taxpayers, so that they can understand how their data is processed, was also demonstrated in the case of Systeem Risico Indicatie (SyRI) in the Netherlands, and in the CJEU case SS SIA.

The second issue is that the eKasa system, similarly to most VAT ERP and electronic invoicing systems in the EU (VAT being, in fine, a fairly harmonized form of taxation), mandates the transfer of enormous amounts of data. Furthermore, as stated by the claimants in the eKasa case, the use of risk-scoring algorithms can reveal more connections and more insights into an individual’s private life when that data is grouped and bundled with the data of other taxpayers. A specificity of the eKasa ERP system is that it mandated the issuance of a unique identifier not only for the seller (i.e. the VAT number), which is relatively standard for such systems, but also a unique identifier for the buyer (cf. Law of 18 June 2008, Art. 2(q)). In other words, if it so desires, the tax administration could have access to the entire purchase history of a customer. One should entertain serious doubts over the relevance of such information in the fight against VAT fraud, as the claimants rightfully did. Accordingly, the use of such systems, as useful and as beneficial as they may be for society, should systematically be accompanied by strong safeguards to prevent abuses and misuse of data. This last addition to the eKasa system seems to be the straw that broke the camel’s back for the claimants, as it is their primary point of contention and argument against the adoption of this specific ERP system.
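To illustrate the claimants’ concern, the following sketch (using a hypothetical receipt schema, not the statutory one) shows how trivially a unique buyer identifier turns a stream of individually innocuous receipts into a complete per-person purchase history once the data sits in a central warehouse.

```python
# Illustrative sketch only: why a unique buyer identifier turns a receipt
# stream into a per-person purchase history. The schema is hypothetical.
from collections import defaultdict

receipts = [
    {"buyer_id": "B-001", "seller_vat_id": "SK111", "item": "medication",   "amount": 12.50},
    {"buyer_id": "B-001", "seller_vat_id": "SK222", "item": "train ticket", "amount": 7.90},
    {"buyer_id": "B-002", "seller_vat_id": "SK111", "item": "groceries",    "amount": 31.40},
]

history = defaultdict(list)
for r in receipts:
    history[r["buyer_id"]].append((r["seller_vat_id"], r["item"], r["amount"]))

# A single group-by over the central data warehouse suffices to reconstruct
# an individual's complete purchase history across all sellers:
print(history["B-001"])
# [('SK111', 'medication', 12.5), ('SK222', 'train ticket', 7.9)]
```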

Ruling of the Court:

The Court sided with the claimants, ruling that the use of the eKasa system and the provision on the ‘unique identifier of the buyer’ are not compatible with the principle of legality and the rights to privacy and data protection, as enshrined in the Slovak constitution.

The Court ruled that the automated assessment of individuals based on their personal data, i.e. by external risk-management or risk-scoring algorithms, has adverse impacts on taxpayers and thus cannot be implemented solely on the basis of managerial decisions. The Court continued by stating that such technology poses special risks, because it is not limited to specific individuals suspected of fraud, but concerns a large number of individuals, with a considerable asymmetry of information.

Moreover, the Court, making reference to the now famous work of Frank Pasquale (The Black Box Society), asserted that because the technology is hard to understand for its addressees, its use may pose a systemic risk to society. Consequently, the use of tax machine-learning systems should be accompanied by ex-ante and ex-post measures to negate the risks to citizens’ fundamental rights, focusing in particular on three areas: (i) transparency; (ii) individual protection; and (iii) collective supervision (§132).

The Court asserted that the use of technology by public administrations cannot result in a State where decisions are inexplicable and unexplained, and where, at the same time, no one is responsible for them (§127).

Key takeaways:

The first key takeaway, and it may be somewhat conjectural, is that many EU legislatures seem to hide behind the ‘white knight’ justification that, because the fight against tax fraud is beneficial to society, they are shielded from the principle of legality and from the necessity to accompany investigative processes with sufficient safeguards. Yet, particularly following the SyRI case and the events of the toeslagenaffaire, this justification is becoming less and less satisfactory. The Slovak Constitutional Court confirmed this in its ruling (§125). The Court recalled that automation cannot be used wherever it is technically possible and useful simply because it will save the administration, and thus taxpayers, some resources. The legislature must ensure the legitimacy and proportionality of the technological solutions it proposes. Citizens should not have to trust the legislature that tax machine-learning algorithms are always a net plus for society; the legislature should bear the burden of proving that, in particular by adopting democratically assented laws and safeguards which negate the risks to citizens’ fundamental rights.

Second, the Court expanded the notion of automated decision-making to include decision assistance and situations where the output of the machine-learning algorithm serves as a crucial input for the subsequent decision of the authorities on whether or not to audit. In the author’s opinion, this addresses one of the major pitfalls of Art. 22 GDPR: automated decision-making is defined as a decision based solely on automated processing. That standard, simply put, does not fit administrative decision-making, which often follows a long chain of command in which the input of several human agents is gathered before deciding whether or not to audit specific taxpayers. After the external risk-management algorithm has pre-selected taxpayers for audit, there may be a number of human decisions, e.g. drafting an audit plan, prioritizing cases, assessing costs. This does not mean that the most important input was not the output generated by the machine-learning algorithm; yet, as unimportant as they may be for citizens, these human inputs would disqualify the algorithm from being characterized as ‘automated decision-making’. Thus this long chain of command, despite being inconsequential, has rendered Art. 22 GDPR largely meaningless in tax and administrative decision-making. This may explain why the article was never invoked in the toeslagenaffaire, despite the fact that many victims claim to have been told by tax officials: ‘you have been targeted because the algorithm said so’.
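The structural problem can be made concrete with a schematic sketch. In the hypothetical chain below, the algorithmic score is the decisive input, since it determines who can be audited at all; yet each subsequent human step arguably takes the process outside a ‘decision based solely on automated processing’ under Art. 22 GDPR. The functions are invented for illustration and simply mirror the human inputs named above.

```python
# Illustrative sketch of a typical audit-selection chain. The human steps
# are hypothetical (audit plan, prioritization, cost assessment); the point
# is that any one of them arguably defeats Art. 22 GDPR's 'solely automated'
# threshold, even though the algorithmic score drives the outcome.

def algorithm_preselects(taxpayers, scores, threshold=0.8):
    # The decisive step: the model pre-selects who may be audited at all.
    return [t for t in taxpayers if scores[t] >= threshold]

def human_drafts_audit_plan(candidates):
    return sorted(candidates)                     # human input 1

def human_prioritizes(plan, capacity=10):
    return plan[:capacity]                        # human input 2

def human_assesses_costs(plan, budget_ok=lambda t: True):
    return [t for t in plan if budget_ok(t)]      # human input 3

scores = {"A": 0.95, "B": 0.40, "C": 0.85}
audited = human_assesses_costs(
    human_prioritizes(
        human_drafts_audit_plan(
            algorithm_preselects(["A", "B", "C"], scores))))
print(audited)  # ['A', 'C'] -- taxpayer B was never seen by any human
```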

Third, the Court provided the legislature with a thorough lecture on the safeguards to adopt in the context of algorithmic governance, emphasizing the crucial importance of the principles of legality, transparency and informational self-determination. Moreover, the Court affirmed the necessity of collective controls to negate the risks to taxpayers’ fundamental rights, such as quality assessments and the identification of errors or imperfections, both ex-ante and ex-post, concluding that it should not be the mission of individual taxpayers to correct the flaws of tax machine-learning algorithms.

References:

Jurisprudence:
Finding of the Constitutional Court of the Slovak Republic, PL. ÚS 25/2019-117

Legislation:
Law of 18 June 2008 on the use of the electronic cash register and on the amendment of the Act of the Slovak National Council No. 511/1992 Coll.
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC
Article 29 Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (WP251rev.01)

Doctrine:
Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Harvard University Press, 2015)
M. Husovec, ‘The Slovak Constitutional Court on Risk Profiling and Automated Decision-Making by the Tax Authority’ (17 December 2021), available at: https://husovec.eu/2021/12/the-slovak-constitutional-court-on-risk-profiling-and-automated-decision-making-by-the-tax-authority/
M. Veale & L. Edwards, ‘Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling’ (2018) Computer Law and Security Review, Vol. 34, Issue 2