The General Data Protection Regulation (GDPR) was adopted by the European Union in 2016 and became applicable on May 25, 2018. It strengthened the protection of the personal information of EU citizens and has extraterritorial effect; that is, its requirements apply not only to companies registered in the EU but also to any company that processes the personal data of EU residents and citizens, regardless of where it is located.

Terminology Difficulties

The GDPR is designed to protect personal data, and the concept of personal data is defined expansively: in essence, it is any information associated with an identifiable individual (who, in this case, is the data subject). Such information includes encrypted data and data on persons operating under a pseudonym.

One small loophole is that the data-protection requirements do not apply to the data of anonymous users. The range of services that can operate fully anonymously, however, is very limited. Although blockchain and cryptocurrencies are still associated with enhanced privacy and with shielding financial information from prying eyes, many significant players in the crypto space have taken the path of compliance and regulatory transparency. Large exchanges such as Coinbase, Poloniex, Kraken, and Gemini are steadily building relationships with the U.S. Securities and Exchange Commission (SEC) and obtaining broker licenses, which, on the one hand, allow them to expand their list of trading instruments and avoid falling into disfavor as illegal operations, but, on the other hand, involve disclosing user information.

Regardless of jurisdiction, be it the SEC, FINRA, or European institutions, registration requires a company to comply with KYC (know your customer) and AML (anti-money laundering) policies. Both of these tools came to the crypto industry from traditional banking and stock-exchange regulation. As part of KYC, a financial company must establish the identity of its client and verify specific information, including personal data, the likelihood of involvement in illegal activities, and the legitimacy of the funds in the account.

As for AML, in April 2018 the EU Parliament approved new measures to combat money laundering that will apply to the cryptocurrency sector. Under the rules, crypto exchanges will be required to verify all users, from traders to providers of cryptocurrency wallets; that is, everyone who provides blockchain services must register. Through this total oversight of all market participants, parliamentarians “will put an end to the anonymity associated with virtual currencies and exchanges,” the report says.

Under such conditions, it is impossible to offer a popular cryptocurrency service that is both in demand and completely anonymous. Consequently, players in the crypto space who wish to survive will have to come to terms with the rules of the GDPR.

Namely

The two basic principles implied by the GDPR are accountability and personal data protection.

Accountability in the processing of personal data implies that the persons involved in data processing are divided into “controlling parties” (controllers) and “processing parties” (processors). Controllers are those who define the “goals and means” of data processing. Processors are those who follow the instructions of the controlling party and process the data on its behalf.

The GDPR requires that these roles be allocated in advance, and then the parties must conclude an agreement in which their responsibilities will be spelled out.

As for the second principle of data protection, it is implemented in the GDPR through the transfer of certain rights to the data subject (that is, the user, the real owner of their personal information). Thus, the data subject has the right to:

 Request access to the information about oneself that the company holds;

 Ask for information to be corrected if it is inaccurate;

 Require that certain information be removed if one does not want the company to hold it.

The second and third points conflict with the nature of the blockchain, namely with its immutability.

In fact, as explained by Dave Michels, a researcher with the Microsoft Cloud Computing Research Centre and the Cloud Legal Project at the Queen Mary University of London, “Blockchain data are not really immutable—they’re just hard to change. Collectively, the nodes control all copies of the blockchain. They can change the data stored on the chain by moving to a new version, called ‘forking.’”

Michels examines the process of complying with the GDPR using a specific example. He proposes imagining a hypothetical blockchain that helps verify the authenticity of information in a résumé, namely an academic degree. Suppose a group of universities has developed such a blockchain. Each university holds its own private key, and every degree a university awards to its students is entered into the blockchain and attributed to that university. An employer can then see this information on the chain and verify the authenticity of the claims made in the résumé.
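
As a rough illustration of how such a scheme might work, here is a minimal Python sketch; the record fields, names, and the choice of Ed25519 signatures are assumptions for illustration, not details from Michels's example. A university signs a degree record with its private key, and an employer verifies the record against the university's public key.

```python
# Minimal sketch of the hypothetical degree blockchain (illustrative only).
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Each university holds its own private key.
university_key = Ed25519PrivateKey.generate()
university_pub = university_key.public_key()

# A degree record containing personal data (hypothetical fields).
degree_record = json.dumps({
    "holder": "Jane Doe",
    "degree": "PhD, Computer Science",
    "university": "Example University",
    "year": 2018,
}, sort_keys=True).encode()

# The university signs the record before it is written to the chain.
signature = university_key.sign(degree_record)

# An employer checks that the record really was issued by that university.
try:
    university_pub.verify(signature, degree_record)
    print("Degree confirmed by the issuing university")
except InvalidSignature:
    print("Record was not issued by this university")
```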

At the same time, academic degrees contain personal information. Accordingly, the creators of the blockchain should design a structure that does not violate the GDPR's data-protection requirements. The developers have two options: create a public (open) blockchain or a private (closed) one.

In the first case, anyone can download the software and run it on their own device. That device, storing the current version of the blockchain, becomes a node and joins the network of other such nodes. This is the principle Bitcoin works on. The more nodes there are, the safer the system and the more resistant it is to centralization, 51% attacks, and other double-spending attacks. This scenario, however, is very difficult in terms of meeting the requirements of the GDPR.

Accountability Aspect

Here the problem is that the universities that developed the blockchain do not necessarily process personal data themselves; all of the network's nodes do. But the nodes do not control the network.

The result is an intricate distribution of roles in which it is difficult to identify the responsible parties, the controller and the processor, as the GDPR requires. The universities cannot be considered the controlling party. Michels draws an analogy with a restaurant: if one imagines the controlling party as the restaurant's chef, the universities do not fit this role. Rather, they “publish a book of recipes that everyone can cook at home with.”

With such a system, it is not at all clear how to determine who controls and who processes, and how to conclude an agreement on obligations necessary for the GDPR between them.

Aspect of the Protection of the Rights of the Subject of Personal Data

“Suppose I no longer want my degree to be stored on the blockchain. How will universities satisfy my removal request?” Michels writes.

To do this, they would have to convince each node to remove the information from its local copy of the blockchain. And even if every node agreed to do so, deleting data from a specific block changes that block's hash, which breaks the hash pointers that link the blocks into a chain.
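
A toy sketch of why this happens: each block stores the hash of its predecessor, so removing data from one block invalidates the pointer held by the next one. The block layout below is a simplified assumption, not any real protocol.

```python
# Toy hash-pointer chain (simplified block layout, illustrative only).
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a three-block chain; each block stores the hash of the previous block.
chain = []
prev = "0" * 64  # genesis pointer
for data in ["degree record A", "degree record B", "degree record C"]:
    block = {"prev_hash": prev, "data": data}
    chain.append(block)
    prev = block_hash(block)

# Remove the personal data from the first block...
chain[0]["data"] = ""

# ...and the pointer stored in the next block no longer matches: the chain is broken.
print(chain[1]["prev_hash"] == block_hash(chain[0]))  # False
```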

And here Michels turns to the second option available to the blockchain's developers: creating a private blockchain, which would make compliance with the GDPR much simpler and more realistic in practice.

In this case, only the developers themselves (or an approved circle of people) control the blockchain. They manage the network's nodes, which can run on their own devices or on rented capacity in the cloud. There will most likely be far fewer nodes, but communication and coordination between them will be better.

Accountability Aspect

With a private system, the universities set up the blockchain and manage it together, so they can be considered the controlling party. The cloud service provider (if any) can be viewed as the processing party, because it processes the data (provides its computing power) on behalf of the universities. The universities and the cloud provider sign an agreement setting out their obligations. Thus, the GDPR requirement for accountability and a clear distribution of roles is met.

Aspect of the Protection of the Rights of the Subject of Personal Data

There are three main actions that a blockchain should allow in order to meet the GDPR's requirements on this issue:

 Search for all instances of the use of personal data relating to a specific individual;

 Extract data and provide it to an individual in a portable format;

 Modify or delete data at the individual's request.

The last point naturally represents the greatest problem.

How to Delete Data from the Blockchain

If all the universities agree, they can remove certain data from a block. Although this destroys the hash pointer linking the blocks, the universities can simply update the links between the blocks by computing new hashes. Since a private blockchain can easily do without a Proof-of-Work algorithm, this process does not require vast computing power.
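
A sketch of that repair step, reusing the simplified block layout from the earlier example: after an edit agreed on by all the universities, the pointers are simply recomputed from the changed block onward, with no mining involved.

```python
# Re-linking a private chain after an agreed edit (no Proof-of-Work involved).
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def relink(chain: list) -> None:
    """Recompute every hash pointer so the chain is internally consistent again."""
    prev = "0" * 64  # genesis pointer
    for block in chain:
        block["prev_hash"] = prev
        prev = block_hash(block)

# A chain whose first block has just been edited by agreement of all universities.
chain = [
    {"prev_hash": "0" * 64, "data": ""},                       # personal data removed
    {"prev_hash": "stale pointer", "data": "degree record B"},
    {"prev_hash": "stale pointer", "data": "degree record C"},
]
relink(chain)
print(all(chain[i + 1]["prev_hash"] == block_hash(chain[i])
          for i in range(len(chain) - 1)))  # True: the links are valid again
```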

In this case, the credibility of the information stored on such a blockchain rests solely on the reliability of the universities controlling it. According to Michels, however, there are other ways to build trustless blockchains that allow information to be deleted while maintaining the integrity of the chain (so-called “privacy by design”).

The first method that allows this is encryption. In the example considered earlier, the universities could encrypt each record and store the data on the chain in encrypted form. Instead of deleting the ciphertext itself, the universities can simply destroy the decryption key associated with it. Thus, although the ciphertext will still exist on the blockchain, no one will have access to the data hidden behind it. Whether such information counts as erased under the GDPR remains an open question, but, at least under the laws of Great Britain, it does, as Michels notes.
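
A minimal sketch of this "erasure by key destruction" idea (sometimes called crypto-shredding) follows; for simplicity it uses a single symmetric key rather than a key pair, but the principle is the same: destroying the only key that can decrypt the record makes the ciphertext left on the chain unreadable. The data and key handling here are illustrative assumptions.

```python
# Crypto-shredding sketch: erase data by destroying the key, not the ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # kept off-chain by the university
record = b"Jane Doe, PhD, Example University, 2018"   # illustrative personal data

ciphertext = Fernet(key).encrypt(record)   # this is what gets written to the chain

# Erasure request: destroy the key instead of rewriting the chain.
del key

# The ciphertext remains on the blockchain, but without the key it can no longer
# be decrypted; whether this counts as erasure under the GDPR is the open question.
```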

The vulnerability of this method lies in the possibility that the key may be stolen before it is destroyed. Given cases in which hacker attacks went unnoticed for several years, one cannot be sure that one's key has not been compromised and is not being held in reserve by some party interested in one's data or funds, ready to be used when needed. Another threat, one that will almost inevitably become real at some point in the future, is quantum computing, which may be able to break the underlying cryptography and thus unlock the protected personal information.

A second, more reliable method of deleting information involves off-chain storage. The universities can take the degree they want to attest, run it through a hash function, and obtain its hash. They then store the resulting hash on the blockchain, while the degree itself (with all the personal data it contains) is kept off-chain. Deleting the information stored off-chain poses no problem, and after its deletion, the data in question remains on the blockchain only as a hash. At the same time, one of the properties of a cryptographic hash is irreversibility: “for a given hash value m, it must be computationally impractical to find a data block X for which H(X) = m.” Consequently, even with the hash, it is impossible to recover the information hashed in this way.
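
A minimal sketch of this off-chain approach, with illustrative data: only the hash goes on the chain, and verification is possible as long as the off-chain original exists.

```python
# Off-chain storage sketch: only a hash of the degree goes on the chain.
import hashlib

degree = b"Jane Doe was awarded a PhD by Example University in 2018"  # stored off-chain

on_chain_hash = hashlib.sha256(degree).hexdigest()  # the only value written on-chain

# Verification while the off-chain copy exists: recompute the hash and compare.
assert hashlib.sha256(degree).hexdigest() == on_chain_hash

# Once the off-chain copy is deleted, only the hash remains, and a cryptographic
# hash is practically irreversible.
```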

As with the first method, it is not yet clear whether this counts as complete deletion of the information under the GDPR, since anyone who holds the original data can compute the same hash, match it with the hash stored on the blockchain, and thus reveal the identity of the data subject. Michels notes that this problem can be solved by adding a random value, a nonce, to the personal data before hashing. This provides protection, provided that the nonce itself is not compromised.
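
A sketch of the same commitment hardened with a nonce; the values and nonce size are illustrative. Without the nonce, simply guessing the underlying data is no longer enough to link it to the hash stored on the chain.

```python
# Salted (nonce-based) hash sketch: guessing the data alone no longer matches.
import hashlib
import secrets

degree = b"Jane Doe was awarded a PhD by Example University in 2018"

nonce = secrets.token_bytes(32)   # random value, kept off-chain alongside the degree
on_chain_hash = hashlib.sha256(nonce + degree).hexdigest()

# An outsider who guesses the degree but lacks the nonce cannot confirm the guess:
assert hashlib.sha256(degree).hexdigest() != on_chain_hash

# Once the degree and the nonce are both deleted off-chain, the on-chain value is
# just a random-looking string, provided the nonce itself was never compromised.
```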

Regulators Are Trying It Too

In late September, France's data protection authority, the Commission nationale de l’informatique et des libertés (CNIL), released official guidance on the interaction between the GDPR and blockchain. Experts have highlighted several important points that this guidance clarifies:

1. In the case of blockchain solutions, the users themselves, that is, the data subjects, can be considered the controlling party.

The CNIL guidance identifies an additional category of “participants,” which includes those who carry out transactions on the blockchain, that is, those who have the right to write data to the blockchain and send it for validation to other network members (miners or node operators). Since these participants themselves determine the purposes for which personal data will be processed and choose the means of processing (the blockchain), they act, according to the CNIL, as a controlling party.

As Laura Jehl, one of the leaders of the blockchain practice at the law firm BakerHostetler, notes, this part of the CNIL guidance will have a significant positive impact on identity-focused blockchain solutions, which transfer control of personal data from the hands of corporations into the hands of users.

2. Cryptocurrency exchanges are also a controlling party.

According to the CNIL, a controlling party is either an individual who processes personal data for professional or commercial purposes or a legal entity that writes personal data to a chain. “An individual who is involved in the purchase or sale of Bitcoin [. . .] can be considered the controlling party if they carry out these transactions within the framework of professional or commercial activities [and work] with the accounts of other individuals,” the document says. By this definition, cryptocurrency exchanges fall directly under the GDPR's definition of a controlling party, and accordingly, all of the controller's obligations apply to them.

3. Miners and node operators are a processing party.

The CNIL guidance proposes treating any participant who confirms transactions or writes data to the chain as thereby processing personal information. Therefore, individuals or companies acting as miners or node operators should be considered processors.

4. Blockchain is compatible with the GDPR's right to have information deleted.

The CNIL offers another way to “destroy” personal data stored on the blockchain, namely, to make access to the data practically impossible, “thereby approaching the effect of data destruction.” In addition, the guidance points to the possibility of destroying the private key or the value from which an encrypted or hashed result is generated. According to the authors of the document, this will be “sufficient to anonymize cryptographic commitments in such a way that they will no longer have the quality of personal data.” Considering that this method of “restricting access” to information by destroying the key was described by Michels, France joins the U.K. among the countries that equate restricting access with destroying the information and consider it sufficient to meet the requirements of the GDPR.

5. Participants in a private blockchain must select one participant who will be the controlling party. Otherwise, they will be considered to have joint control.

For data stored on a closed (private) blockchain, the controlling party is considered to be the enterprises that determine the purposes of processing and write data to the chain. The CNIL leaves them two options:

 Create a legal entity in an association format;

 Choose one participant who will make decisions on the protection of personal information;

If the group does not choose one of these options, the principle of joint control will apply; that is, each participant will be held responsible for all of the personal data of all participants on the platform.

6. Smart contract developers can choose their own role and be either the controlling or the processing party.

They will be considered a processing party if they develop smart contracts at the request of a third party.

The guidance considers this aspect using the example of specific smart contracts launched last year by the insurance company AXA. In this case, “the software developer offers the insurance company a solution in the form of a smart contract that allows the company to automate compensation payments to passengers when their flight is delayed. This developer will be considered the party that processes the data, and the insurance company the controlling party,” the document reads.

7. Any business that wants to use blockchain technology should carefully evaluate privacy concerns before launching its solution.

According to the CNIL, an organization developing or using blockchain solutions should put compliance with personal data protection requirements first, taking care both to meet the requirements of the GDPR and to minimize potential harm to users.

The CNIL encourages companies to start by asking whether a blockchain is really necessary in their particular case, or whether the same result can be achieved by traditional centralized means. The guidance notes that “blockchain is not always the best technology for data processing. It can cause difficulties for the owner [of the data] given the requirements of the GDPR.”

8. Private blockchains should set a minimum number of nodes to protect data privacy.

The CNIL encourages blockchain operators to keep in mind the threat of a 51% attack, in which a participant controlling more than half of the hash rate can manipulate transactions and, in effect, the whole blockchain. The guidance therefore advises introducing a mandatory requirement for a minimum number of nodes sufficient to eliminate this risk. Another aspect concerns proper protection against collusion and against the consolidated control over the network (and, accordingly, over personal information) that node operators could otherwise exercise.

9. Data subjects should be able to challenge the results of the execution of smart contracts. Exactly how, however, is not yet clear.

As Laura Jehl points out, the CNIL guidance is inconsistent on the question of the extent to which the owner of personal data can dispute the result of a completed smart contract.

According to one of its points, the guidance requires the possibility of intervening in the operation of smart contracts before they execute, arguing that “the data subject must have the right to human intervention in order to express his point of view and challenge the decision, after which the contract can be executed.”

At the same time, the next provision says that it is enough to allow the data subject to challenge the smart contract after it has been executed: “[. . .] it is necessary that the controlling party provides the possibility of human intervention, which will allow [. . .] the data subject to challenge the decision, even if the contract has already been completed.”

Thus, the moment at which the data subject may intervene remains unclear, but the CNIL obliges developers to give the data owner this level of influence over smart contracts.

10. There will be “right” and “wrong” ways to use the blockchain in terms of privacy and security, and more regulations will be developed later.

Jehl notes that on October 3, 2018, the European Parliament adopted a resolution entitled “Distributed Ledger Technologies and Blockchains: Building Trust with Disintermediation.” In it, distributed ledger technology is described as “a tool that extends the rights of citizens, giving them the ability to control their own data.” The resolution calls on EU member states to promote the adoption and spread of the technology. In Jehl's opinion, this is a sign that, despite fears, the European Union does not seek to limit the use of blockchain, but rather wants to develop the safest framework for its use and even encourages it.

Unsolved Problems

For all the optimism of these attempts to reconcile the GDPR and blockchain, it is difficult not to notice that the requirement to be able to delete information is technically impracticable on-chain. From here on, everything depends on whether the parties behind the GDPR are ready to make concessions and treat the “loss of the key” as equivalent to the complete removal of the information encrypted with it (even though it is not).

An even more philosophical question is whether a “mutable blockchain” is actually a blockchain at all. Here, again, everything depends on the willingness to make concessions in terminology, but this time it is the other side that would have to take the step. It should be noted that “true cypherpunks,” in particular Timothy May, do not recognize private platforms as blockchains in the true sense. And although the trend of building closed corporate blockchains is being actively developed by a number of consortia, for some members of the community these developments already lie outside the decentralized space. “The tension between the anonymous and KYC approaches is a key issue. It is ‘decentralization, anarchy, and peer-to-peer’ against ‘centralization, privacy, and backdoor’ [a part of an algorithm that allows a developer to gain unauthorized access to data and control the system] [. . .] There are two ways: freedom vs. private and centralized systems,” May writes.