1 Introduction
The use of artificial intelligence (AI) can improve many of the administrative processes carried out by tax authorities. However, it is important to address the risks that the use of these technologies may entail. In this regard, ESEVERRI has predicted that the arrival of AI in tax procedures will lead to a reduction in the rights and guarantees of taxpayers. This paper analyses the extent to which the use of this technology can affect those rights and guarantees.
The fact that AI may pose certain threats to the fundamental freedoms of taxpayers should not result in its use being banned. Tax authorities cannot do without this and other computing technologies for tasks that require massive data processing, as their use can greatly improve a wide range of administrative functions. However, measures must be established to ensure a responsible, safe and ethical use of AI, and the protection of taxpayers’ rights is one of the fundamental aspects of achieving that objective. This vision has recently been shared by the Spanish Tax Agency (hereinafter, AEAT) itself: its “AI Strategy of the Tax Agency”, dated May 27, 2024, repeatedly refers to these principles within the framework of the strategy proposed for the development and use of AI.
However, it must be kept in mind that there is currently no rule that specifically governs the development and use of this technology by tax administrations. The EU Regulation on AI does not affect all human activities equally and will have little effect on the systems used by tax and customs administrations in their administrative procedures, insofar as these are excluded from the high-risk classification. This means that the stricter rules corresponding to high-risk systems will not apply, but only certain transparency rules, which will not contribute in any relevant way to the improvement of taxpayers’ rights. Nor is there any national regulation addressing the application of these technologies in the tax field.
It is essential to close this regulatory gap by establishing an appropriate legal framework and preventive mechanisms that promote a responsible and transparent development and use of these technologies in tax procedures. Priority must be given to an ethical use of AI, based on due respect for the rights and guarantees of taxpayers, as well as on the new rights that must be strengthened and consolidated to protect citizens in the digital age. The challenge for lawyers is to propose regulatory reforms that comply with these principles.
In any case, in the absence of specific regulation, these technologies are applied within the framework of the existing legal system, which establishes the principles on which administrative action must be based and which must, in any event, respect the rights and guarantees of the citizens affected. The following sections examine how such principles and rights can operate as a limit on tax administrations that use tools based on AI and mass data processing. Taxpayers affected by these administrative actions retain all the rights available to them for their legal defence, so that in their appeals against the administrative acts that affect them they may raise any issue arising in the course of those acts, including issues related to the use of AI.
2 The principles that govern the actions of tax administrations as a guarantee of the proper use of AI
In the deployment and use of AI tools in tax procedures, tax administrations must comply with all the principles that inspire and order administrative action. GARCÍA-HERRERA points out that administrations adopting big data and AI technologies must be governed by the principles of prudence, non-discrimination, proportionality, transparency and data governance. Space limitations prevent a comprehensive study of this issue, so the analysis below is limited to some of the principles that are basic for these purposes: proportionality, good administration, transparency and legal certainty.
2.1 The principle of proportionality
The principle of proportionality is among the principles that inform the administrative procedure and is basic to the organization of public powers. It plays a role of enormous relevance, since it must guide the legislator in regulating the procedures for applying taxes and must also guide the exercise of regulatory power. It is likewise recognized as a fundamental principle in the application of the tax system by article 3.2 of Law 58/2003, of December 17, General Tax Law (hereinafter, LGT). In addition, it is a principle continually used by national and European case law as a means of controlling the activity of the legislator and of the Administration. It is therefore fully operational as a means of controlling the activity of tax administrations, which must conform as far as possible to the criteria derived from it. This principle affects all actions and acts of application of the tax system, since it determines how the relationship between public power and fundamental rights must develop. Beyond acting as a limit on the different public powers, it reflects the way in which the application of taxes must be understood in a democratic society, and it must therefore govern all decision-making processes within the framework of tax procedures. Any data processing must pass through the sieve of the principle of proportionality, which requires respect for the three criteria that comprise it: suitability, necessity and the weighing of the measure, also known as proportionality in the strict sense.
Firstly, this principle requires that the means used in administrative actions be appropriate to the purposes that justify them. Even if the objective pursued is perfectly lawful, the means used to achieve it may not be, if they are not suitable for that purpose. It must therefore be justified that the use of the technological tools employed, and the data processing they involve, is appropriate to the purposes pursued. In addition, all those steps that are essential within the framework of the different administrative procedures must be respected so that the suitability of administrative acts and actions can be reviewed. Certain procedural steps are absolutely essential to this end, such as the submission of representations by the interested party and/or the hearing procedure, regulated in letters l) and m) of article 34.1 of the LGT. However, the key element for this suitability review is the motivation (statement of reasons) of the acts, since its absence or insufficiency leaves taxpayers defenceless. This is precisely one of the problems of procedures resolved automatically by algorithm-based technology, since experience shows the deficiencies in the motivation produced by decisional computing. The motivation must be individualized, taking into account the circumstances of the specific case, so a document made up of stereotyped phrases repeated in every decision does not meet this requirement. The motivation produced by an AI system can hardly reach sufficient legal richness, particularly when discretionary powers are exercised or when the administrative decision depends on the legal interpretation of a normative text, since AI does not yet have the legal reasoning capacity of human beings. In this type of situation the Administration is therefore exposed to many tax acts being annulled for absent or defective motivation. However, the AEAT has indicated in different documents that it does not use AI systems to make automated decisions, so as long as this remains the case the problem indicated will not arise.
Secondly, the need for data processing using AI systems must be justified, since the principle of minimum intervention is an essential component of the principle of proportionality. It is a principle that has been recognised for years in administrative and tax regulations, as well as in the case law of the Supreme Court (hereinafter, TS), which elevated it to the category of a general principle applicable to all public administrations. It requires that, when there is a choice between several measures appropriate to achieve the same objective, the least burdensome be used, since administrative action must be as unintrusive as possible. The aim is to prevent the situation of taxpayers from being unnecessarily worsened. Therefore, if a less drastic solution exists to achieve the same goal, that is, a measure involving a lesser restriction of individual rights and freedoms, it must be preferred. A measure adopted in breach of this rule violates the principle of proportionality, so administrative actions that are more restrictive of fundamental rights than other possible alternatives of similar effectiveness may be annulled. The most restrictive measures should be adopted only when the options more respectful of taxpayers’ rights have failed. This principle is of great relevance to the establishment and fulfilment of formal duties and obligations, since the compliance costs that such duties impose on taxpayers must be limited, and the actions of the tax administrations that require taxpayers’ intervention in such procedures must be carried out in the least burdensome way possible. The burdens imposed on taxpayers must not be disproportionate to the advantages they bring to the public good. Therefore, the incorporation of new AI technologies into tax procedures cannot lead to greater burdens for taxpayers; on the contrary, it should reduce them.
Finally, account must be taken of the principle of proportionality in the strict sense, as the Constitutional Court calls it. This principle is aimed at preventing administrative activity from causing greater harm than it seeks to avoid, by weighing the benefits derived from the measure adopted against the restrictions of rights to which it gives rise. In this sense, the LGT contains various provisions that reflect this idea. They are rules referring to procedures that are especially burdensome in themselves, such as the enforced collection procedure, the adoption of precautionary measures or the suspension of administrative acts in economic-administrative proceedings, where suspension without guarantees is admitted in certain situations. In short, mere formal compliance with the rules is not enough: in its ordinary way of acting, the Administration must restrict the rights of taxpayers as little as possible.
In the area under analysis, the principle of proportionality means, among other things, that unjustified and disproportionate interference in the personal sphere of citizens is not permissible, for example through the use of personal data that lack any minimum tax significance or that are unrelated to the specific procedure for which they were collected. This connects with the principle of the relevance and tax significance of the data, which derives from the right to data protection.
If the tax authorities were to take all these principles and criteria into account, the quality of administrative interventions would improve, greater respect for the rights of taxpayers would be achieved and compliance costs for them would be limited and, of course, the objectives pursued would be achieved with the same effectiveness. However, we are still a long way from this happening.
2.2 The principle of good administration
Another guiding principle of administrative action that must take center stage in the process of incorporating AI into tax procedures is the principle of good administration. The relevance of this principle has been recognized by the most recent jurisprudence of the Supreme Court, as well as by the CJEU in numerous matters affecting the tax field. This principle is based on article 41 of the Charter of Fundamental Rights of the European Union and, at an internal level, can be derived from articles 9.3 and 103 of the Spanish Constitution and the general principles included in article 3.1.e) of Law 40/2015, of October 1, on the Legal Regime of the Public Sector. The Supreme Court has argued that the Administration is required to conduct itself diligently to avoid possible dysfunctions arising from its actions. Under this principle, the administrations must not only strictly comply with the procedural rules, observing all the procedures provided for therein, but must also ensure the full effectiveness of the guarantees and rights recognized to taxpayers in the Spanish Constitution, in the laws and in other development regulations. In order to truly and effectively guarantee the rights of taxpayers, the tax administrations are required to behave objectively and diligently, through a proactive attitude. Jurisprudence also highlights that this principle constitutes, according to the best doctrine, a new paradigm of 21st century law, referring to a mode of public action that excludes negligent management.
This principle gives rise to a series of rights for citizens, as well as a corresponding list of duties for public administrations, which must act, in the exercise of their functions, in a fair, reasonable, equitable, impartial and transparent manner and for the benefit of citizens, respecting their rights and interests and treating them all equally, without discrimination. It is a principle that allows the rights and guarantees of citizens to be strengthened in administrative procedures. It is related to the fundamental right to a fair administrative procedure, without disproportionate delays, as well as to other rights and guarantees that derive from it, such as the principles of transparency, impartiality, access to justice and effective administrative protection, and the rights to information, to participation, to be heard, to obtain an administrative resolution within a reasonable time, to effective and equitable treatment of one’s affairs, to an Administration that acts in good faith and to the motivation of administrative decisions.
In relation to the subject matter under study, relevant consequences can be drawn from this principle, although some of them can also be derived from other principles governing the tax procedure and from some of the taxpayers’ rights examined later. Thus, this principle requires transparency and publicity in the use of AI. It also implies the need to control the proper use of this technology so that the results derived from it are correct, which requires implementing all the processes aimed at that end. A protocol of action and a guide to good practices must therefore be established, setting out the principles and procedures to be followed to guarantee an adequate development of the systems and their proper use by officials. Internal controls or audits must also be carried out to guarantee compliance with those principles and procedures and, in particular, to control the biases and deviations that these tools may produce. It would also be advisable to carry out periodic external audits to verify the proper functioning of the systems and highlight possible improvements, and to establish a quality seal for the algorithms. Another aspect affected by this principle is the Administration’s obligation to motivate tax acts sufficiently, since citizens have the right to have decisions based on AI duly explained to them so that they can pursue the defensive actions the legal system affords. In addition, it requires avoiding unnecessary and undue delay in the recognition of the rights requested, so this principle could be invoked to correct cases in which the delay in the resolution derives from an automated decision based on an algorithm. It is therefore a guiding principle that should inspire administrative action in tax procedures, but it is also a principle that can be invoked in an appeal or claim when it is considered to have been violated by an administrative action or resolution in any procedure for the application of taxes. This makes it possible to do justice even where there is no flagrant breach of the regulations, which is precisely what can happen with the use of AI by tax administrations, given the existing lack of regulation.
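As an illustration of the kind of internal bias control mentioned above, the following minimal sketch (in Python, with a hypothetical data layout; it does not describe any actual AEAT procedure) compares the rates at which a selection tool flags taxpayers across groups and raises an alert for human review when the disparity exceeds a screening threshold:

```python
# Minimal sketch (hypothetical data layout): a periodic internal check comparing the rate
# at which an AI selection tool flags taxpayers across groups, as one input to a bias audit.
from collections import defaultdict


def selection_rates(records: list[dict], group_key: str) -> dict[str, float]:
    """records: dicts containing a group attribute and a boolean 'selected' set by the tool."""
    flagged, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        flagged[r[group_key]] += int(r["selected"])
    return {group: flagged[group] / totals[group] for group in totals}


def disparate_impact_alert(rates: dict[str, float], threshold: float = 0.8) -> bool:
    """Flags the tool for human review if any group's selection rate falls below
    `threshold` times the highest group's rate (a common screening heuristic)."""
    highest = max(rates.values())
    return any(rate < threshold * highest for rate in rates.values())
```

Such a check does not by itself prove or disprove discrimination; it is merely one screening signal that an internal or external audit could use to decide where closer human scrutiny is needed.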
2.3 The principle of transparency
Article 3.1.c) of Law 40/2015, of October 1, on the Legal Regime of the Public Sector, includes among the general principles that public administrations must respect the principle of “transparency of administrative action”. Law 19/2013, of December 9, on transparency, access to public information and good governance, also constantly refers to this principle. It is, therefore, a fundamental principle of the action of the public authorities that affects all public administrations.
The duty of transparency must be embodied in various rights for the taxpayers affected by tax procedures in which AI-based systems are used. Without being exhaustive, some of these rights can be cited. First, the taxpayer’s right to know whether an AI system has been used in the procedure, and the scope of such use, must be recognized, since the appeal that the taxpayer files against an act may be based on some issue related to the tool used. Second, the right of citizens to know which computer applications tax administrations use must be established and given concrete form.
Thirdly, if an act has been adopted directly by the system, this circumstance must be stated in the automated decision that ends the procedure, as is already an obligation of the Administration in some neighbouring countries. Although it has been suggested that this duty could be based on an extensive interpretation of letter ñ) of article 34.1 of the LGT, the truth is that it is difficult to compel the administrations to comply with it, given the broad terms of that provision, so an express provision to this effect would be needed for it to be applied in a mandatory manner by the tax administrations. A later section analyses in more detail the new rights and guarantees for the protection of taxpayers affected by the use of AI by these administrations, which relate back to the principle of transparency, and examines the pillars on which that principle should rest for these purposes.
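Purely as an illustration of how such a disclosure duty could be operationalized, the following minimal sketch (in Python, with hypothetical field names; it does not reflect any existing administrative system) stamps into the resolution whether an AI system intervened and in what capacity:

```python
# Minimal sketch (hypothetical structure): recording in the resolution itself whether an AI
# system intervened, and to what extent, so that the taxpayer can raise it in a later appeal.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class AIDisclosure:
    system_name: str               # application or model used, as it would appear in a public inventory
    role: str                      # "fully_automated" or "decision_support"
    human_reviewer: Optional[str]  # official who validated the output, if any


def disclosure_clause(d: AIDisclosure, issued: date) -> str:
    """Returns the clause to be included in the act that ends the procedure."""
    if d.role == "fully_automated":
        return (f"This act of {issued.isoformat()} was adopted automatically by the system "
                f"'{d.system_name}', without individual human intervention.")
    return (f"This act of {issued.isoformat()} was prepared with the support of the system "
            f"'{d.system_name}' and validated by {d.human_reviewer}.")
```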
2.4 The principle of legal certainty
The principle of legal certainty also applies in this matter. The scarcity of regulation, together with the existing opacity regarding the use of these technologies, means that legal certainty is almost entirely absent. In this sense, it has been argued that the lack of transparency in relation to the use of AI could constitute a violation of the principle of legal certainty in its manifestation as a prohibition of arbitrariness. However, it is very difficult to build an appeal or claim solely on this legal principle, since a court is unlikely to annul an administrative act for contravening it alone. In any case, as has already been indicated on different occasions, it would be highly advisable for the legislator to establish an appropriate legal framework enabling compliance with the principles analysed in this work.
3 Traditional fundamental rights as a guarantee against the use of AI and mass data processing
Fundamental rights constitute a guarantee against the use of AI by tax administrations. In the absence of legal regulation that develops this matter, fundamental rights constitute an insurmountable limit to administrative action. It has been highlighted that, given the misgivings and distrust generated by the use of AI by tax administrations, they must be extremely careful in safeguarding taxpayers’ rights. It is true that these rights would be more effective against possible abuses if there were legislative development of this matter, which would allow the necessary adaptations to be made to guarantee the protection of taxpayers in this new technological context. Until this task is carried out, legal operators must be especially vigilant to ensure that tax administrations respect the essential content of such rights.
In particular, although they are not the only rights that may be affected, the analysis below focuses on those that, in our opinion, may be most relevant for these purposes: the right to personal and family privacy, the right to non-discrimination and equality in the application of the law, and the right to the protection of personal data. Without being exhaustive, we will try to analyse some of the aspects that may affect these rights and could even lead to their violation. As disputes arise and the courts resolve the specific cases brought before them, it will become possible to verify the effectiveness of these rights in eradicating inappropriate behaviour by the tax authorities.
3.1 The right to personal and family privacy
The right to personal and family privacy may be violated as a result of the processing of data that, although they may have an indirect economic content, refer to the personal sphere of the taxpayer. This situation may occur, fundamentally, when data are collected indiscriminately from open sources, without any control over their tax significance for a specific investigation already under way. The data collection phase cannot be prospective, as this would lead the tax authorities to obtain all kinds of data and only later select the most relevant ones in the processing phase. This situation may arise, for example, when trying to prove a person’s tax residence in Spain: a large amount of information can be obtained from publications on social networks, with the result that information of no tax interest, but capable of affecting the private life of the person under investigation, may end up being handled.
3.2 The right to non-discrimination and equality in the application of the law
The use of AI by tax authorities could reinforce equality and help make tax management fairer, since administrative decisions can become more objective, depending less on the subjectivity of officials, to the extent that this technology allows the system to respond identically to identical factual situations. Greater effectiveness and efficiency in the use of public resources can also be achieved. However, these applications are imperfect and can produce inadequate results, which creates risks for equality: they can introduce biases that lead to discriminatory situations, which occur when the use of these systems results in unjustified harmful treatment of a person or group of people, in breach of the traditional principle of non-discrimination. In any case, the threat is not the technology itself, but its misuse.
Numerous authors have agreed that the right that may be most seriously affected by the use of AI is the right to non-discrimination and, more specifically, the right to equality in the application of the law. Although biases are inherent to any human activity, the use of big data and AI applications can increase these types of situations, because they carry out massive data processing, such that the number of people who may be affected by these applications is very high. To prevent these cases from occurring, variables that give rise to discriminatory biases must not be introduced, since the model must be neutral in relation to certain characteristics of the human being, such as sex, race, religion, political ideology, age, income level, social class, nationality, etc. When factors of this type are used, despite their legal irrelevance for the resolution of the procedure, unjustified biases may occur in the decisions that are adopted, violating the principles of equality and non-discrimination. The same consequence would occur if the opening of the verification procedures had its origin in some of the parameters stated, so guarantees must be introduced to eliminate biases in the data entered into AI systems, even if this means less effectiveness in the pursuit of tax fraud.
On the other hand, biases can originate in different technical aspects necessary for the operation of the systems. Firstly, the discriminatory situation can originate in the algorithms used by the applications, which has led to the formulation of a principle of algorithmic non-discrimination. Secondly, the violation of this right can also originate in the data used to train the system, fundamentally when the cases used for this purpose have themselves been generated by AI, since in that case the erroneous results derived from the tool can be amplified. In particular, these discriminatory biases can occur in the systems used to select the taxpayers who are to be subject to a tax control procedure, which would lead to certain types of operations, sectors, persons or groups being investigated on a recurring basis without objective justification.
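By way of a simple illustration of one such safeguard, the following minimal sketch (in Python, with hypothetical feature names; it is not any real selection model) removes the legally protected characteristics listed above from the inputs of a selection model before training or scoring. This alone does not rule out indirect discrimination through proxy variables, which is why outcome-level audits of the kind described earlier remain necessary:

```python
# Minimal sketch (hypothetical feature names): dropping legally protected attributes from the
# inputs of a taxpayer-selection model, so that they cannot drive the risk score directly.
PROTECTED_ATTRIBUTES = {
    "sex", "race", "religion", "political_ideology", "age",
    "income_level", "social_class", "nationality",
}


def strip_protected(features: dict) -> dict:
    """Returns the feature set without protected attributes.
    Proxy variables (e.g. postcode) may still encode them indirectly,
    so outcome-level audits are still required."""
    return {name: value for name, value in features.items() if name not in PROTECTED_ATTRIBUTES}


# Example usage with an illustrative record:
record = {"declared_income": 41000, "sector": "hospitality", "nationality": "ES", "age": 47}
model_inputs = strip_protected(record)   # {'declared_income': 41000, 'sector': 'hospitality'}
```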
3.3 The right to the protection of personal data
The right to the protection of personal data is one of the most relevant rights involved in the use of AI by tax administrations, and it is a precondition for the protection of other fundamental rights. In fact, it has been highlighted that the only legal guarantees currently existing in this matter are those established in the data protection regulations, namely Organic Law 3/2018, of December 5, on the Protection of Personal Data and guarantee of digital rights, and Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 (hereinafter, GDPR). However, it must be taken into account that these guarantees cover only part of the problem raised, since data protection is just one of the many issues involved in the use of AI by administrations. If personal data are processed through this type of tool, the principles established by these regulations must be respected. It must also be kept in mind that the 14th additional provision of Organic Law 3/2018 allows the survival of the rules establishing exceptions and limitations to the exercise of rights that were in force before the GDPR. This means that articles 23 and 24 of Organic Law 15/1999, of December 13, on the Protection of Personal Data, remain in force as long as they are not expressly modified, replaced or repealed. These rules allow tax authorities to deny the exercise of the rights recognized in articles 15 to 22 of the GDPR where administrative actions aimed at ensuring compliance with tax obligations would be hindered and, in any case, when the affected party is subject to inspection actions. The analysis of the implications of AI for this right must therefore be carried out with this starting point in mind.
For the purposes of personal data protection, the principles set out in the aforementioned regulations must be complied with, such as transparency, relevance, quality, veracity and control of the data collected. The principle of transparency requires the application of the rules contained in articles 12 to 14 of the GDPR and, in addition, that a register of processing activities be kept, in accordance with article 30 of the GDPR. The regulations also establish that personal data may only be used for a previously determined purpose, which requires that the data be relevant to that purpose; data that are not necessary, that is, that have no tax significance in the specific procedure for which they are collected, may not be processed. Since AI tools allow massive data processing, there is a risk that far more data than necessary will be used, when the processing must rely on the smallest amount of personal data that allows its purpose to be fulfilled. Furthermore, there must be control over the storage of this information with tax implications, since it may only be kept and processed for the specific procedure for which it was obtained and for a limited period of time, and it should be destroyed once the resolution that ends the procedure has become final. In short, data cannot be kept indefinitely on a prospective basis, in case they might be useful in a future investigation into the same taxpayer or another person related to them. All these principles can be jeopardized by the indiscriminate collection of a taxpayer’s data, particularly from open sources, since it is easy for personal data without tax implications and with no direct relationship to the procedure for which they were collected to be swept into the processing. In addition, the extraction of data from this type of source, particularly from social networks, jeopardizes the accuracy and veracity of the data, since they are not verified, so their processing can lead to incorrect results. As regards control of the information, tax authorities must have in place all the technical and organisational measures needed to ensure the security of the data stored or processed, preventing unauthorised communication of or access to them. In short, all mechanisms that allow adequate governance of these data must be implemented, and these measures must be reviewed periodically, taking into account the state of the technology, the nature of the data and the risks to which they are exposed.
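As a simple illustration of the purpose-limitation and storage-limitation principles described above, the following minimal sketch (in Python, with hypothetical fields; it is not an actual data-governance tool) keeps only data relevant to the open procedure and discards them once the closing resolution has become final:

```python
# Minimal sketch (hypothetical fields): purpose limitation and storage limitation applied to
# collected records, keeping only data relevant to the open procedure and discarding them
# once the resolution that ends the procedure has become final.
from dataclasses import dataclass
from datetime import date


@dataclass
class CollectedDatum:
    field: str
    value: str
    procedure_id: str     # procedure for which the datum was collected
    tax_relevant: bool    # relevance assessed against the purpose of that procedure


def minimise(data: list[CollectedDatum], procedure_id: str) -> list[CollectedDatum]:
    """Keeps only data that are tax-relevant and tied to the procedure at hand."""
    return [d for d in data if d.tax_relevant and d.procedure_id == procedure_id]


def purge_after_finality(data: list[CollectedDatum],
                         resolution_final_on: date,
                         today: date) -> list[CollectedDatum]:
    """Once the resolution ending the procedure is final, the retained data are erased."""
    return [] if today >= resolution_final_on else data
```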
Furthermore, it is very important to determine whether the provisions of article 22.1 of the GDPR, which establishes a right not to be subject to fully automated decisions, including profiling, are applicable for these purposes. However, section 2 of the same provision allows this type of decision in certain cases. Thus, letter b) allows it when the decision “is authorised by Union or Member State law”. Therefore, when it is expressly provided for by national regulations, as is the case in the LGT, the taxpayer may not object to receiving a fully automated decision as a way of ending the procedures. We agree with PÉREZ BERNABEU that taxpayers may not invoke the aforementioned right when it is a semi-automated decision, that is, when it derives from a decision-making support tool. Finally, another relevant aspect is the possibility for tax authorities to establish risk profiles in an automated manner for the purposes of classifying or segmenting taxpayers, which may give rise to significant consequences for their legal situation. These taxpayer segmentation activities can be developed both in the area of tax control, so that inspections focus on taxpayers who have a high risk of having defrauded, and in the area of collection, since debtors could be classified according to the risk of late payment, so that, for example, those who do not comply with payment obligations on time could have more difficulties in accessing a deferral or installment payment. It must be analyzed whether the right of citizens in relation to the creation of profiles established in article 22.1 of the GDPR is applicable in the tax area. These types of tools are not provided for in the regulations, so the conditions of the exception provided for in article 22.2.b) of the GDPR do not apply, although articles 23 and 24 of Organic Law 15/1999 allow tax administrations to deny the rights derived from this article 22.1 of the GDPR when they hinder administrative actions aimed at ensuring compliance with tax obligations and, in particular, in the event that inspection actions are carried out.
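To make the distinction between fully automated decisions and mere decision support concrete, here is a minimal sketch (in Python, with invented, illustrative weights and thresholds; it does not describe any real AEAT scoring model) of a risk-segmentation routine whose adverse outcomes are routed to an official for individual review instead of being applied automatically:

```python
# Minimal sketch (illustrative weights and thresholds): a risk-segmentation score used only
# as decision support, so that any adverse consequence (e.g. refusing a deferral) is referred
# to a human official rather than applied automatically.
def risk_segment(late_payments: int, open_debts: int) -> str:
    score = 2 * late_payments + open_debts   # invented weighting, for illustration only
    if score >= 6:
        return "high"
    return "medium" if score >= 3 else "low"


def deferral_recommendation(late_payments: int, open_debts: int) -> dict:
    segment = risk_segment(late_payments, open_debts)
    return {
        "segment": segment,
        # the tool only recommends; the final decision is reserved to a human official
        "action": "refer_to_official_for_review" if segment == "high" else "standard_processing",
    }
```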
5 Conclusions
Neither European nor Spanish regulations establish specific rules governing the use of AI by tax administrations. The EU Regulation excludes the applications used by those administrations from the high-risk classification, so no special precautions are established in relation to them and the impact of this regulation in this area will be minimal. For its part, national tax regulation does not refer to AI, although tools based on this technology must comply with the rules governing the use of computer applications, in addition to respecting the general legal framework, in which there is already some reference to the use of AI by public administrations.
Therefore, although there are no regulations that refer specifically to this matter, taxpayers are not left to their own devices, since the use of AI must take place within the existing legal framework and, in particular, in keeping with the different principles that govern administrative action, such as proportionality, good administration, transparency and legal certainty, whose application in this area has been studied in this work. In addition, taxpayers hold numerous rights and guarantees, which apply at every stage of the tax procedure. In this sense, a central part of this article has consisted in analysing the risks and threats posed by the use of AI and the massive processing of data by tax administrations in relation to some of the most relevant fundamental rights, such as the rights to personal and family privacy, to non-discrimination and equality in the application of the law, and to the protection of personal data. All these rights apply to the use of these new information technologies, regardless of whether the act has been issued automatically through them or whether they have been used to support the decision-making of officials.
Taxpayers are also entitled to other new-generation rights that are beginning to apply in the digital sphere, such as the right to “algorithmic explainability”. We have concluded, however, that this right can be traced back to the principle of transparency and to the need for administrative acts to be motivated in natural language, without requiring tax administrations to disclose the full architecture of AI-based tools: they do not have to publish the source code or the algorithms used by the programmes, although the current situation of opacity must also be overcome. To that end, the elements in relation to which greater transparency should be required have been identified.
In any case, if our study makes anything clear, it is that a regulatory framework must be established to ensure the proper use of these new tools in the actions carried out by the tax administrations. The legal regime governing the use of these applications has yet to be built: it is essential to strengthen control of, and limits on, their development and use, and to achieve greater transparency in the use of this technology, particularly in relation to the aspects pointed out in this work. Most importantly, the rights of citizens who may be affected by the use of AI must be guaranteed, which will require a comprehensive legislative reform that addresses the specific regulation of this matter. It is up to jurists to demand that the regulations evolve so as to establish the mechanisms that allow better protection of the rights and guarantees of taxpayers who may be affected by the use of these applications by the tax administrations.