Privacy and Self-Regulation: Markets for Electronic Privacy
Eli M. Noam
Professor of Finance and Economics
Director, Columbia Institute of Tele-information
The conventional wisdom is that electronic communication constitutes a major threat to individual privacy.
Wiretapping, eavesdropping, and data banks have long been part of the Big Brother and Nosy Sister scenario.
No problem is ever new. Jeopardies to privacy have been associated with electronic media from the beginning. Gossipy manual operators, party lines with curious neighbors, and the absence of a warrant requirement for wiretapping, all created privacy problems. The first American patent for a voice-scrambling device was issued only five years after the invention of the telephone.
The New York Police Department, always on the technology frontier, has been listening in on telephones since at least 1895.
Today, a new generation of electronic privacy problems has emerged as more and more transactions are conducted on more and more networks.
Given that privacy is important to so many people, and given that information technology keeps raising new questions, what approach should be adopted to deal with privacy?
Understand that privacy is not a static concept but an interaction in which the rights of different parties collide. A has certain preferences about the information he receives and lets out. B, on the other hand, may want to learn more about A, perhaps in order to protect herself. The controversies over caller identification, or over AIDS disclosure by medical personnel, illustrate that privacy is an issue of control over information flows, with a much greater inherent complexity than the conventional "consumers versus business" or "citizens versus the state" analyses suggest. Different parties have different preferences over information and need a way to reconcile those preferences, or else remain in tension with each other. This suggests the need for interactive negotiation over privacy.
As the new technological options emerge they create new opportunities but also new privacy problems. How can such problems be dealt with?
The primary policy response has been regulatory. Within that approach there have been two major directions--centralized general protection and decentralized ad-hoc protection. West European countries, in particular, have pursued the former: they passed comprehensive (omnibus) data protection laws, established institutionalized boards with fairly rigorous rules, and coordinated internationally on information collection and data flows. These rules, moreover, had a tendency to spread. The United States, in contrast, has dealt with specific problems one at a time, and with different approaches across the country.
The call for the state to control and protect privacy is a natural response, especially in the field of electronic communications, given their history around the world as either a state-controlled telephone or broadcast monopoly or a tightly regulated sector. This regulatory perspective has led to a view of electronic privacy problems largely as an issue of rights versus the state or its regulated monopoly firms, and to the question of how to create such rights in the political, regulatory, and legal sphere. But such a view is static: having a right is often believed to be the end of the story. Yet in most parts of society, the allocation of rights is only the beginning of a much more complex interaction.
A second option is self-regulation, where an industry agrees to restrict some of its practices. Realistically, though, self-regulation is rarely voluntary (unless it serves an anti-competitive purpose): it usually occurs only under the threat of state regulation, and it can therefore be considered a variant of direct regulation.
The third major option is to rely on technology, and encryption is the major tool. Methods of protecting information with encryption have become powerful and convenient. Encryption permits individuals to control access to their information, to establish property rights, and to create the foundation for markets.
Encryption permits individuals to sell information about themselves directly, instead of letting various market researchers and credit checkers snoop in their demographics, personal histories, and garbage cans. Individuals would define a set of access rights: only their doctor would be allowed to view their medical records. Some categories of information would be freely accessible, while others would be costly. Presumably, the more valuable the information is to the buyer, and the more negative it is to the seller, the higher the price. This system would also allow an individual to keep track of who asked for the information. And the reselling of the information would be authorized only by agreement of both key holders.
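The access-rights scheme sketched above can be illustrated in code. This is a minimal, hypothetical sketch: the `AccessPolicy` class, category names, and prices are my illustrative assumptions, not an existing system, and real enforcement would of course rest on encryption rather than a software check.

```python
# Hypothetical sketch of the per-category access rights described above.
# All names and prices here are illustrative assumptions.

FREE = 0.0

class AccessPolicy:
    """Access rules an individual might define over categories of personal data."""

    def __init__(self):
        self.rules = {}        # category -> (allowed_parties or None, price)
        self.access_log = []   # who asked for what, so the individual can keep track

    def set_rule(self, category, allowed_parties=None, price=FREE):
        self.rules[category] = (allowed_parties, price)

    def request(self, party, category, offered=0.0):
        """Return True if `party` may read `category` at the price offered."""
        self.access_log.append((party, category))        # record every request
        allowed, price = self.rules.get(category, (None, None))
        if price is None:                                # no rule: deny by default
            return False
        if allowed is not None and party not in allowed: # restricted to named parties
            return False
        return offered >= price                          # must meet the asking price

policy = AccessPolicy()
policy.set_rule("medical", allowed_parties={"doctor"})   # doctor only
policy.set_rule("demographics", price=FREE)              # free access
policy.set_rule("purchase_history", price=4.0)           # costly

assert policy.request("doctor", "medical")
assert not policy.request("telemarketer", "medical")
assert policy.request("telemarketer", "demographics")
assert not policy.request("telemarketer", "purchase_history", offered=3.0)
```

Note that every request, granted or denied, lands in `access_log`, capturing the point that the individual can keep track of who asked.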
For privacy transactions to occur, however, there are several prerequisites. They include:
Sufficiently low transaction costs.
A legal environment that permits transactions to be carried out.
An industry structure that permits transactions to occur.
Symmetry of information among the transacting parties.
No "market failure," i.e., no growing instability in the market.
The ability to create property rights, or to exclude.
Note that encryption does not solve the problem of information resale to a third party C once the information has been decrypted by the second party B. A buyer of information cannot be stopped from memorizing or photographing the decrypted information on his screen and then reselling it.
So far, many of you will agree with me. But now I want to pose the critical question: does all this trading and bargaining over privacy make a difference? The answer, surprisingly, is that it does not really make a difference, at least not under certain idealized conditions. But this is how academic analysis works. The first and obvious reason is that courts have been reluctant to grant property rights in personal information outside the case of luminaries. In one case, Avrahami v. U.S. News & World Report, a gutless court managed to hold for two organizations that exchanged subscriber name lists without permission, even though Virginia law clearly requires such permission.
This reluctance of courts (and probably of legislatures) to recognize property rights in residual information is not surprising in light of the role of direct marketing in the economy. However, property is not only established from above, by formal statutes or court decisions, but also from below, by the simple mechanism of an individual's ability to exclude others. Good fences make good neighbors, and good transactions as well. Such access control creates the possibility of bargaining, and transforms information from a "public good" (like a lighthouse's flashing) into a private good (like a flashlight).
It is useful to consider, as a framework for discussion, the economic theorem of Nobel laureate Ronald Coase, a Chicago economist. Coase argues that in a conflict between the preferences of two parties, the final outcome will be determined by economic calculus and (as long as transaction costs are low) will be the same regardless of the initial allocation of rights. The reason is that people do not just stop dead in their tracks after some allocation is made; they go on optimizing their condition. This dynamic behavior drives lawyers and techies crazy, because it leads to equilibria that are different from those designed by social or real engineers.
Let us look at privacy, using the example of a credit card billing record and a telemarketer who tries to sell a consumer something after first checking out her shopping habits as recorded on the credit card bills. Both the consumer and the third-party vendor attribute a certain value to the information. For example, it may be worth $3 to the telemarketer to get the information; if necessary, it would be willing to pay a potential customer up to that amount.
Conversely, assume that the consumer would be willing to pay—grudgingly, for sure—up to $4 to the telemarketer to keep its hands off the data, which she considers private. The $4 is the value she places on her privacy in this instance. Thus, if the telemarketer can access the information without the barrier of encryption, the consumer would "bribe" it not to access the information in order to keep her privacy. Right now we don't have such a mechanism in place, but a micro-transaction system could take care of it.
Suppose now the consumer gets smart and starts to encrypt the information. In that case, it is now the telemarketer who will try to bribe the consumer to get the information. But it would offer at most $3, and that is not enough for the consumer to consent. In other words, under the numbers chosen, the outcome is the same, with or without encryption.
Now suppose that for some reason the value to the telemarketer rises, say to $6. In this situation, without encryption, the consumer cannot pay the telemarketer enough to keep it from accessing the information. And with encryption, the telemarketer would pay for the consumer's cooperation with a payment of $4 or more, so that permission is granted. Implication: again, the same result with or without encryption. With the earlier numbers, no access to the information took place either way; now, access takes place either way.
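The arithmetic above can be made explicit in a short worked example. This is an illustrative sketch of the Coasean argument under the idealized assumptions stated earlier (frictionless bargaining, low transaction costs); the function name and payment labels are my own.

```python
# Worked version of the numbers above: after frictionless bargaining, whether
# access occurs depends only on the two valuations, not on who holds the
# initial right (i.e., whether the data are encrypted). Illustrative sketch.

def coase_outcome(value_to_marketer, value_to_consumer, encrypted):
    """Return (access_occurs, who_pays) after frictionless bargaining."""
    if encrypted:
        # Consumer holds the right: the marketer must buy consent, and it
        # pays only if the data are worth more to it than privacy is to her.
        if value_to_marketer > value_to_consumer:
            return True, "marketer pays consumer"
        return False, "no payment"
    else:
        # Marketer holds de facto access: the consumer must buy forbearance,
        # and she pays only if her privacy is worth more than the data.
        if value_to_consumer > value_to_marketer:
            return False, "consumer pays marketer"
        return True, "no payment"

# $3 to the telemarketer vs. $4 to the consumer: no access either way.
assert coase_outcome(3, 4, encrypted=False) == (False, "consumer pays marketer")
assert coase_outcome(3, 4, encrypted=True) == (False, "no payment")

# $6 vs. $4: access either way; encryption only changes who pays.
assert coase_outcome(6, 4, encrypted=False) == (True, "no payment")
assert coase_outcome(6, 4, encrypted=True) == (True, "marketer pays consumer")
```

In both scenarios the first element of the result (access or no access) is identical with and without encryption; only the direction of payment changes.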
In other words, the existence of encryption may largely determine who has to pay whom, not whether something will happen. Encryption does not necessarily determine whether access to private information actually takes place; it only affects the final wealth distribution. This interactive concept is often difficult to grasp if one is used to thinking in absolutes--a rule is a rule--but the analysis shows that encryption does not create privacy. It simply means that consumers will get paid to give it up. In the past, such transactions were not possible. Encryption now creates barriers that force outsiders to pay for what they used to get free. At the same time, the emergence of micro-transactions will make it possible for data miners to compensate consumers for their information.
So far we have discussed the role of markets in the provision of privacy in a largely pragmatic way--will it work? Yes, in some cases. No, in other cases. But at least as important is the normative question--should privacy be part of a market?
There are several critiques of the notion of markets for privacy:
To state that privacy is a basic human right is a noble sentiment with which I am in accord, but it does not follow that privacy is therefore outside the mechanism of transactions. As mentioned, a right is merely an initial allocation. It may be acquired without charge and be universally distributed regardless of wealth, but it is in the nature of humans to have varying preferences and needs, and to exchange what they have for what they want. Thus, whether we like it or not, people continuously trade in rights. In doing so they exercise a fundamental right, the right of free choice.
Departures from textbook civics are socially undesirable if the rights in question were given up under some form of duress, for example if in a single-employer town workers must agree not to assemble as a condition of employment. But when an informed, lucid, sober, and solvent citizen makes a choice freely, the objections are much harder to make. They then boil down to a transaction being against public policy. To make these transactions illegal, however, does not stop many of them, since there are willing buyers and sellers, a classic situation of victimless crime.
The extent of the success of such a ban depends, among other factors, on the ability of the state to insert itself into the transaction. In the case of privacy, which by its nature is an interactive use of information, such insertion is difficult. And one can expect imaginative schemes to circumvent such a prohibition.
Those who believe that the market approach to privacy protection is overly generous to business might find themselves pleasantly surprised, because the tools of access control, such as encryption, will have shifted the balance of power to individuals and to the protection of privacy. Indeed, it will be the business users of personal information who will end up objecting to transactions. They are, of course, worried that while they (together with politicians and parties) today have relatively free access to individuals or to data about them, a system where they might have to pay compensation in return for consent could become expensive. They are correct, but what can they do about it? Access to an individual, even if sanctioned by law, will require the latter's cooperation. Right now, individuals do not yet have effective means to make those desiring personal information compensate them. But the tools to change this, such as encryption or caller identification, are here. Soon, equipment makers and communications service providers will enable consumers to conveniently sell access. And when this happens, those marketers who claim to live by the free market will also have to play (and pay) by its rules.
So while I marvel at these elegant tools that Diffie and his fellows have created, and while I fully understand the difference between a theoretical analysis and the real world, I do not believe that crypto conceptually creates more privacy after transactions have taken place. It does not really affect transactions between second and third parties, and it does not affect transactions between first and second parties. What the encrypters do instead is make second parties pay, and they thereby redistribute wealth to consumers. They enable more transactions and more markets in private information. But mostly, they redistribute money to consumers, part of which is then taken away in higher prices. Thus, we should call the people who created these tools crypto-capitalists, and salute them all the way to our bank.