Can P3P help to protect privacy worldwide?

Rüdiger Grimm
GMD Darmstadt
Dolivostr. 15
D-64293 Darmstadt
+49 6151 869716
grimm@darmstadt.gmd.de

Alexander Rossnagel
University Kassel
Nora-Platiel-Strasse 5
D-34109 Kassel
+49 561 804 3130
rosnagel@hrz.uni-kassel.de

ABSTRACT

Privacy is a basic cultural requirement, often regulated by national law, but not everywhere in the same way. Privacy protection must be effective across national borders. Technical tools and procedures can help to enforce and propagate privacy protection for Internet communication worldwide. The "Platform for Privacy Preferences Project (P3P)" is a standardization approach of the World Wide Web Consortium for privacy protection on the Web. This article describes the history and current state of P3P and evaluates its effect against legal requirements, particularly against the strict laws of Germany and Europe. The article is the result of an interdisciplinary cooperation combining technical and legal expertise.

Keywords

Privacy, regulation, self-regulation, P3P, PICS, privacy technology, law enforcement

1. INTRODUCTION - PRIVACY ACROSS NATIONAL BORDERS

The regulatory power of a national state, even of the European Union, is limited with respect to global data networks: German privacy protection law is applicable if the provider of Internet services has its headquarters or at least a subsidiary in Germany. However, with respect to providers who offer their services via the Internet from foreign countries, privacy protection law and privacy protection control are powerless.

The more privacy protection is withdrawn from the sphere of influence of the national legislator, the more it has to become effective on a worldwide basis. Due to the lack of an effective global legal system, this is only possible if privacy protection is integrated into technology. Unlike privacy protection law, privacy protection technology is effective worldwide and, unlike legislators, its providers and users are fast-learning systems that can respond quickly to new technical challenges. German law has coined the term "informational self-determination". Technical tools can help persons to protect their information privacy actively: they can take appropriate measures to allow the desired processing of their data and to prevent any undesired processing.

A first approach to the required worldwide privacy protection standard might be the P3P standardization initiative.

2. W3C STANDARDIZATION ACTIVITIES

The World Wide Web Consortium (W3C) is a voluntary amalgamation of enterprises and organizations interested in standardizing WWW technologies. Its "Platform for Privacy Preferences Project (P3P)" is dedicated to standardizing functions that support privacy protection [6].

2.1 Brief History of P3P

P3P is based on the "labeling" schemes for filtering unwanted Web content. In 1996, in particular to protect children against pornographic and violence-glorifying WWW offerings, the W3C started a standardization initiative under the name "Platform for Internet Content Selection (PICS)" [7].

The labeling idea of PICS opened the way to technical support of privacy protection. Reidenberg's approach of using PICS to signal privacy options was an interesting starting point [8]. With the further development within P3P, the W3C wants to consider and implement suggestions from many legal and political privacy protection principles. These include the OECD Guidelines on the Protection of Privacy of 1980 and others listed in the P3P Guiding Principles [6, Appendix 7].

Another motivation for P3P was the fact that service providers want to receive correct data. For this purpose, they want to increase confidence through more transparency about privacy protection. They count on the profile stored on the user side containing only correct personal data, which is transmitted when required.

2.2 Past Design of P3P and Reasons for its Failure

According to the first P3P drafts (until August 1999), users and providers of an Internet service should negotiate privacy practices on a free and equal basis. In compliance with the P3P rules, every user would store his requirements for the privacy practice of an Internet service in the form of formalized "preferences" given to his browser. Every Internet service would describe its privacy practices to the user in the form of formalized "proposals". The Internet service would offer one or several policies, and the user would select one or return a counterproposal. This sending and returning may continue until an agreement is reached or the connection is discontinued. After an agreement has been reached, the user would return his personal data as agreed and the provider would supply the service. The first drafts therefore distinguish three communication phases:

Figure 1: Notice, choice, consent, data transfer and service of P3P in August 1999 [5]
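The negotiation phase of these early drafts can be sketched in a few lines. The sketch below is an illustration only: all function and data names are our own assumptions, and the real 1999 draft encoded proposals in extensions of the web protocol rather than in Python objects.

```python
# Illustrative sketch of the 1999 P3P negotiation phase.
# All names are hypothetical; the actual draft exchanged formalized
# "proposals" over extended HTTP, not Python values.

def negotiate(user_accepts, service_proposals, max_rounds=5):
    """Walk through the service's proposals until the user's agent
    accepts one; otherwise the connection is discontinued (None)."""
    for round_no, proposal in enumerate(service_proposals):
        if round_no >= max_rounds:      # avoid endless back-and-forth
            break
        if user_accepts(proposal):
            return proposal             # consent and data transfer follow
    return None                         # no agreement: connection ends
```

Only after an agreement is reached would the third phase, the transfer of personal "data" fields, begin.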

P3P describes three different areas: First, it specifies privacy protection policies as a set of formalized statements. A policy can say, for example, that the service provider uses the name and address of a buyer only for the delivery of goods, while he wants to analyze the connection information to improve his own offering. The provider may specify general purposes predefined by the standard. In addition, he can describe an intended purpose in free wording.
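For illustration, the delivery example above could be rendered as a data structure like the following. The field names are our own assumptions; the actual P3P specification defines its own machine-readable syntax and a predefined purpose vocabulary.

```python
# Hypothetical in-memory rendering of the example policy; the real
# P3P specification uses its own formalized syntax and vocabulary.
policy = {
    "statements": [
        # name and address are used only for the delivery of goods
        {"data": ["user.name", "user.address"],
         "purposes": ["delivery"]},
        # connection information is analyzed to improve the offering,
        # with an additional free-wording description of the purpose
        {"data": ["connection-info"],
         "purposes": ["service-improvement"],
         "free_text": "analysis of click-streams to improve our catalogue"},
    ],
}

def purposes_for(policy, data_item):
    """Collect every purpose the policy declares for one data item."""
    return {purpose
            for stmt in policy["statements"] if data_item in stmt["data"]
            for purpose in stmt["purposes"]}
```

Such a structure makes the provider's declared practice mechanically inspectable, which is what enables automatic processing on the user side.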

Secondly, P3P describes the decision-making process of the user as well as a negotiation protocol for a privacy policy. For example, a P3P user tool can automatically analyze general purposes for personal data. Richer descriptions of intended purposes often require explicit decisions by the users. P3P provides for both. The user would return his consent to the service provider in the form of a hashed policy text approved by both parties. Because of this interactive phase, the data elements which describe a policy are called "proposals", and a hashed consent is called a "propID". In response to a "proposal", user and service can communicate a "status" to each other.
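The "propID" idea can be illustrated with a short sketch: a digest of the agreed policy text gives both parties a compact, tamper-evident reference to exactly the wording they consented to. The digest algorithm and the naive whitespace canonicalization below are our assumptions for illustration, not the draft's actual procedure.

```python
import hashlib

def prop_id(policy_text: str) -> str:
    """Illustrative consent identifier over an agreed policy text."""
    # Naive canonicalization so that insignificant whitespace does not
    # change the identifier (an assumption made for this sketch).
    canonical = " ".join(policy_text.split())
    # SHA-1 is chosen for illustration; the 1999 draft may have
    # specified a different digest.
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()
```

If either side later presents a policy text whose digest no longer matches the stored identifier, the agreed wording has been altered.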

Thirdly, P3P describes a protocol for the transfer of personal data in the form of "data" fields.

From the very beginning in 1998, the drafts were accompanied by demonstrator software [1]. However, they were never implemented in real products.

The criticism of P3P was aimed mainly at the complexity of the necessary implementation. In fact, P3P is much more complex than PICS. It requires not only extensions of the web protocol HTTP for the new data elements, but also the implementation of a new protocol for negotiation. This in turn requires extensive automatic decision-making mechanisms on both sides as well as status monitoring of policy negotiations.

In addition, a standardized data transfer requires new mechanisms to hold personal data on the user side in a "repository" and to access them from inside and outside. On the other hand, many services already have proprietary data collection methods, for example, form inquiries.

Critics say that negotiation is not really needed. Much more important, they argue, is simply informing users about the privacy practice of a service and giving them the opportunity to say "yes" or "no", i.e. "notice" and "choice". The further access to personal data within a service might be performed by the services according to their own rules, as before, and would not really need a standardized method.

2.3 New Proposal "P3P Version 1"

A productive response to this criticism was given in the second half of 1999. The basic idea of the new draft, dated November 1999 and referred to as "Version 1", is to maintain the key concept, but to structure the complexity such that one begins with "notice and choice" alone. "Notice" and "choice" are in any case the first steps of any negotiation. One can thus immediately begin with implementing the formalized description of privacy protection policies and their local choice. The subsequent steps, negotiation and data transfer, could be standardized later.

At the date of writing this article, this "Version 1" is still under discussion. The latest draft of "P3P Version 1" available today is dated May 10, 2000 [6].

Unfortunately, P3P Version 1 was not only modularized as mentioned above, but slimmed down to the point that negotiation, explicit consent by the user, and data transfer were deleted entirely. Therefore, of the original three phases, only one and a half are left in Version 1, namely:

Figure 2: "Notice and choice" of P3P V1, May 2000

Only one of the initial three specification areas remains, namely the formalization of the description of a privacy policy. Accordingly, the corresponding data element is no longer referred to as a "proposal" but as a "policy". A service provider can no longer offer several "policies", but only one. And a user can neither select among several proposals nor make a counterproposal; he can only accept or reject "silently". If he rejects, he must reckon with denial of service. In this way, explicit consent has also been put on ice for the time being.

P3P Version 1 has moved its focus from protocol considerations to policy specification. Several new mechanisms, which should be kept in future versions, have been added. On the one hand, the draft recommends the creation of a "safe zone" on the user's side: an information phase in which the user should not yet release any personal data. On the other hand, the draft allows policies to be expressed in a more flexible and more precise manner. The same policy can describe different practices for different subareas of a web page. A policy can now include the description of dispute resolution mechanisms. It can specify remedies in case the service provider breaks its policy. And last but not least, the specification is open for new extension fields for the "policy" (e.g., period of validity; geographical classification; alternatives for cookie policy) and for the data-related "statements" in a policy (e.g., cancellation periods).

Altogether, the new draft dated May 2000 (in essence already the draft of November 1999) is a first step in the right direction towards implementing technical privacy protection. However, dropping the facility for the user to select among proposals, to make his own proposals and to give his explicit consent makes this first step too small.

3. POSSIBLE USE OF P3P

Even if the protocol is not completely consistent with our privacy protection requirements (3.2), it might find reasonable fields of application (3.1), and it can be advanced towards more comprehensive privacy protection (3.3).

3.1 Examples of Application

An essential element of P3P is the formalized description of privacy practices and preferences, for which a specific language (APPEL) has been developed on the user side: A provider formulates his "policy" and refers to it in the first response to a web inquiry. With the aid of a P3P user agent, a user formulates his "preferences", stores them locally and compares them with the downloaded policies of the providers. The user agent initiates the further procedure in accordance with the result, either automatically or after a local interaction with the user. Version 1 of the P3P standard describes possible applications for the user [6, 1.1.4 "P3P User Agents"].
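The comparison a user agent performs can be sketched as a simple subset test between the purposes a policy declares and the purposes the user's preferences allow for each data category. This is a toy model under our own naming assumptions; APPEL itself is a rule language with considerably richer matching behavior.

```python
def acceptable(policy, preferences):
    """Toy model of a user agent's check: every statement of the
    provider's policy may use each data category only for purposes
    the locally stored preferences allow."""
    for stmt in policy["statements"]:
        for data_item in stmt["data"]:
            allowed = preferences.get(data_item, set())
            if not set(stmt["purposes"]) <= allowed:
                return False   # agent would warn the user or reject
    return True                # agent may proceed automatically
```

Depending on the result, the agent either proceeds automatically or triggers the local interaction with the user described above.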

A P3P user agent can guarantee that a user accepts only those services which satisfy his privacy requirements. A further possible application of P3P is the rating of the privacy friendliness of Internet services. Formal descriptions allow automatic (or at least semi-automatic) rating, thus enabling a comprehensive rating of Internet services. In addition, this increases transparency when comparing different Internet services. Newly emerging enterprises, so-called "infomediaries", which offer privacy protection services to users as data trustees, will therefore use P3P most intensively.

3.2 Conformity with German Privacy Protection Law

Germany was one of the first countries to establish strict data protection laws for Internet services: the Teleservices Data Protection Act (TDDSG) and the Agreement on Media Services of the Federal Länder (MDStV) [4]. P3P contains mechanisms which support the following privacy protection requirements of the TDDSG and the MDStV:

From the viewpoint of German privacy protection law, P3P only improves the transparency of data processing. It does not support system data protection. It does not enforce a binding of data to their purpose. And it does not support user control functions. In particular, the following requirements are not supported and have to be guaranteed in another way:


3.3 Extensions of P3P

P3P is a standard which allows individual further developments. Therefore, in Germany or Europe, P3P could be complemented such that its applications meet further requirements of privacy protection. We see the following three starting points for this:

4. EVALUATION

It makes a big difference whether the standard is applied on the basis of statutory privacy protection or as a substitute for privacy protection law. P3P was created against the background of the situation in the USA, namely a conflict between governmental privacy protection regulation and self-regulation. It is intended to strengthen self-regulation as technical support, together with the privacy policies of the providers and the "seal programs" of trustworthy third parties. In this context, the weakness in system data protection becomes obvious: P3P cannot enforce policies or agreed purposes, it can only describe them; there are no mechanisms to improve data thrift. In Germany, P3P complements privacy protection law. Purpose definition, the obligation to inform, required consent and user rights are mandatory here. P3P provides technical support for some, but not all, legal requirements.

P3P Version 1 is weak in that the user can only choose between acceptance and refusal of a privacy policy (take it or leave it). This allows no feedback to the provider. The provider obtains no information about the privacy protection requirements of the customers, nor about the reasons why they do not visit his WWW offering or leave it again rather quickly.

P3P should be used nevertheless. The use of P3P software would increase the privacy protection awareness of all people involved. Providers would present the privacy protection level of their services and reveal the extent to which they comply with the law (in Germany: with the TDDSG or MDStV). P3P would provide a technical facility to transport privacy protection audit seals or other seals. Users can obtain information about the privacy policies of competing providers and have at least the choice to avoid insufficiently protected services or not to visit services which do not use P3P. Consumer associations or privacy protection officers would be able to design and distribute "popular user preferences" and "popular policies". This might result in a privacy protection culture. Policies and preferences conforming with strict local laws could be distributed in Europe, and even worldwide. In this way, an Internet service which meets strict requirements might propagate its privacy-protection-oriented practice beyond its local borders throughout the Internet.

Introducing P3P Version 1 is relatively easy because, in a first step, it requires software extensions only on the side of the user, and these extensions can quickly be offered worldwide. This would soon provide an imperfect but internationally effective tool. It could be a first practical step, with further steps to follow. If the communication functions of the older P3P drafts are added later, and if user control functions are supplemented, we might be on the right path.

5. ACKNOWLEDGEMENTS

We thank the German Federal Ministry for Economics and Technology for funding our research project "Datenschutz in Telediensten" (DASIT, privacy protection in Internet services), conducted by DG BANK Frankfurt, GMD Darmstadt and the University of Kassel [2].

6. REFERENCES

[1] AT&T, Cranor, Lorrie F.: P3P Privacy Tools. Esp.: AT&T Policy Generator (outdated vocabulary) and PrivacyMinder, www.research.att.com/projects/p3p/

[2] Grimm, R.; Löhndorf, N.; and Scholz, Ph.: Datenschutz in Telediensten (DASIT) am Beispiel von Einkaufen und Bezahlen im Internet ("data protection for Internet services (DASIT) at an example of purchase and payment in the Internet"). In: DuD 5/1999, Mai 1999, Vieweg Verlag, Wiesbaden, 272-276.

[3] Grimm, Rüdiger: User Control over Personal Web Data. Proceedings of the EEMA/TeleTrusT Conference ISSE'99, Berlin, October 99, 14 pages.

[4] F.R. Germany: Information and Communication Services Act - Informations- und Kommunikationsdienste-Gesetz (IuKDG). Incl. TDDSG. Deutscher Bundestag, 13.7.1997, http://www.iid.de/rahmen/iukdgebt.html/.

[5] W3C: The Platform for Privacy Preferences 1.0 Specification ("old version"). W3C Working Draft 26 August 1999.

[6] W3C: The Platform for Privacy Preferences 1.0 (P3P1.0) Specification. W3C Working Draft 10 May 2000.

[7] W3C: Platform for Internet Content Selection (PICS). PICS Technical Specifications, Completed Specifications for PICS-1.1, Service Descriptions (Oct 1996) etc. www.w3.org/PICS/; in particular PICS Signed Labels (DSig) and "PICS Statement of Principles".

[8] Reidenberg, Joel R.: The Use of Technology to Assure Internet Privacy: Adapting Labels and Filters for Data Protection. Lex Electronica Vol. 3 No 2, ISSN 1201-7302, 1997, www.lex-electronica.org/articles/v3-2/reidenbe.html.

[9] Rossnagel, Alexander: Datenschutzaudit – Konzeption, Umsetzung, rechtliche Regelung ("data protection audit – architecture, implementation, legal regulation"). ISBN 3528057343, Vieweg, Wiesbaden usw. 2000, 162 pages.