

Privacy-enhancing technologies (PETs) is the standardized term for specific methods that act in accordance with the laws of data protection. PETs allow online users to protect the privacy of their personally identifiable information (PII) provided to and handled by services or applications.

Privacy-enhancing technologies can also be defined as:

Privacy-enhancing technologies is a coherent system of ICT measures that protects informational privacy by eliminating or minimising personal data, thereby preventing unnecessary or unwanted processing of personal data, without losing the functionality of the information system.
(van Blarkom, Borking & Olk 2003)



Goals of PETs

The objective of PETs is to protect personal data and to assure users of technology that their information is confidential and that data protection is a priority for the organizations responsible for any PII, allowing users to take one or more of the following actions related to their personal data sent to, and used by, online service providers, merchants or other users.

  • Increase control over their personal data sent to, and used by, online service providers and merchants or other online users (self-determination).
  • Minimize the personal data collected and used by service providers and merchants.
  • Use pseudonyms or anonymous data credentials to provide anonymity.
  • Achieve informed consent about giving personal data to online service providers and merchants.
  • Negotiate the terms and conditions of giving personal data to online service providers and merchants (data handling/privacy policy negotiation): in privacy negotiations, consumers and service providers establish, maintain, and refine privacy policies as individualized agreements through an ongoing choice among service alternatives, and the transaction partners may additionally bundle the personal information collection and processing schemes with monetary or non-monetary rewards (a minimal sketch follows this list).
  • Remotely audit the enforcement of these terms and conditions at the online service providers and merchants (assurance).
  • Log, archive and look up past transfers of their personal data, including what data has been transferred, when, to whom and under what conditions.
  • Facilitate the use of their legal rights of data inspection, correction and deletion.
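To make the negotiation item in the list above concrete, the following minimal Python sketch models privacy policy negotiation as an ongoing choice among service alternatives. Every class, field and value is invented for illustration: the provider publishes data-handling offers, and the user's agent keeps only the offers that stay within the user's limits, preferring the least data-hungry one (data minimization).

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ServiceAlternative:
        """One data-handling offer a provider puts on the table."""
        name: str
        data_requested: frozenset        # e.g. {"email", "age"}
        shared_with_third_parties: bool
        retention_days: int
        reward: str = ""                 # optional monetary or non-monetary incentive

    @dataclass
    class UserPreferences:
        """The user's side of the negotiation: what they are willing to give."""
        max_data: frozenset
        allow_third_party_sharing: bool
        max_retention_days: int

    def acceptable(offer: ServiceAlternative, prefs: UserPreferences) -> bool:
        """An offer is acceptable only if it stays within every user constraint."""
        return (offer.data_requested <= prefs.max_data
                and (prefs.allow_third_party_sharing or not offer.shared_with_third_parties)
                and offer.retention_days <= prefs.max_retention_days)

    def negotiate(offers, prefs):
        """Return acceptable offers, least data-hungry first (data minimization)."""
        ok = [o for o in offers if acceptable(o, prefs)]
        return sorted(ok, key=lambda o: len(o.data_requested))

    if __name__ == "__main__":
        offers = [
            ServiceAlternative("full profile", frozenset({"email", "age", "address"}),
                               True, 365, "10% discount"),
            ServiceAlternative("minimal", frozenset({"email"}), False, 30),
        ]
        prefs = UserPreferences(frozenset({"email", "age"}), False, 90)
        for offer in negotiate(offers, prefs):
            print("acceptable:", offer.name)

Running the example accepts only the "minimal" offer: the "full profile" alternative asks for an address the user is not willing to share, retains data longer than the user allows, and shares it with third parties.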



Existing PETs

Examples of existing privacy-enhancing technologies are:

  • Communication anonymizers hiding the real online identity (email address, IP address, etc.) and replacing it with a non-traceable identity (disposable / one-time email address, random IP address of hosts participating in an anonymising network, pseudonym, etc.). They can be applied to email, Web browsing, P2P networking, VoIP, chat, instant messaging, etc.
  • Shared bogus online accounts. One person creates an account for MSN, providing bogus data for the name, address, phone number, preferences, life situation, etc., and then publishes the user ID and password on the Internet. Everybody can now use this account. The user can thereby be sure that the account profile contains no personal data about them, and is also spared the hassle of having to register at the site themselves.
  • Obfuscation refers to the many practices of adding distracting or misleading data to a log or profile, which may be especially useful for frustrating precision analytics after data has already been lost or disclosed. Its effectiveness against humans is questioned, but it has greater promise against shallow algorithms; a toy sketch follows this list.
  • Access to personal data: The service provider's infrastructure allows users to inspect, correct or delete all their data stored at the service provider.
  • Enhanced privacy ID (EPID) is a digital signature algorithm supporting anonymity. Unlike traditional digital signature algorithms (e.g., PKI), in which each entity has a unique public verification key and a unique private signature key, EPID provides a common group public verification key associated with many unique private signature keys. EPID was created so that a device can prove to an external party what kind of device it is (and, optionally, what software is running on the device) without needing to reveal its exact identity, i.e., to prove that it is an authentic member of a group without revealing which member. It has been in use since 2008.
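As a toy illustration of the obfuscation practice listed above, the following Python sketch pads a user's real interests with randomly drawn decoys before a profile is released. The decoy pool and function names are invented for this example; a realistic deployment would need a much larger and more plausible noise vocabulary to be effective against anything beyond shallow algorithms.

    import random

    # Hypothetical pool of plausible decoy interests; real obfuscation would
    # draw from a far larger, more realistic vocabulary.
    DECOY_POOL = [
        "gardening", "astronomy", "chess", "salsa dancing", "woodworking",
        "birdwatching", "calligraphy", "rock climbing", "baking", "model trains",
    ]

    def obfuscate_profile(real_interests, noise_ratio=3, rng=None):
        """Hide each real interest among roughly `noise_ratio` decoys and
        shuffle, so that position in the profile reveals nothing."""
        rng = rng or random.Random()
        decoys = rng.sample(DECOY_POOL,
                            min(len(DECOY_POOL), noise_ratio * len(real_interests)))
        mixed = list(real_interests) + decoys
        rng.shuffle(mixed)
        return mixed

    if __name__ == "__main__":
        print(obfuscate_profile(["privacy tech", "cryptography"],
                                rng=random.Random(42)))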



Future PETs

Examples of privacy-enhancing technologies that are being researched or developed include limited disclosure technology, anonymous credentials (illustrated by the online car rental example below), negotiation and enforcement of data handling conditions, and data transaction logs. Limited disclosure technology provides a way of protecting individuals' privacy by allowing them to share only enough personal information with service providers to complete an interaction or transaction. This technology is also designed to limit tracking and correlation of users' interactions with these third parties. Limited disclosure uses cryptographic techniques and allows users to retrieve data that is vetted by a provider, to transmit that data to a relying party, and to have these relying parties trust the authenticity and integrity of the data. Anonymous credentials are asserted properties or rights of the credential holder that do not reveal the true identity of the holder; the only information revealed is what the holder of the credential is willing to disclose. The assertion can be issued by the user himself/herself, by the provider of the online service or by a third party (another service provider, a government agency, etc.). For example:

Online car rental. The car rental agency does not need to know the true identity of the customer. It only needs to make sure that the customer is over 23 (as an example), that the customer has a driver's license and health insurance (i.e. for accidents, etc.), and that the customer is paying. Thus there is no real need to know the customer's name, address or any other personal information. Anonymous credentials allow both parties to be comfortable: they allow the customer to reveal only as much data as the car rental agency needs for providing its service (data minimization), and they allow the car rental agency to verify its requirements and get its money. When ordering a car online, the user, instead of providing the classical name, address and credit card number, provides the following credentials, all issued to pseudonyms (i.e. not to the real name of the customer); a minimal sketch of this flow follows the list:

  • An assertion of minimal age, issued by the state, proving that the holder is older than 23 (note: the actual age is not provided)
  • A driving licence, i.e. an assertion, issued by the motor vehicle control agency, that the holder is entitled to drive cars
  • A proof of insurance, issued by the health insurance company
  • Digital cash
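The sketch below shows, under strong simplifying assumptions, how such pseudonymous credentials and limited disclosure could fit together: issuers vouch for predicates about a pseudonym, and the rental agency checks only the predicates it needs, never the customer's identity. All names are hypothetical, and an HMAC held by each issuer stands in for a real digital signature, so verification here goes back through the issuer; genuine anonymous-credential schemes use public-key cryptography and zero-knowledge proofs so that the relying party can verify on its own.

    import hashlib
    import hmac
    import secrets
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Credential:
        """An assertion about a pseudonym, vouched for by an issuer.
        The HMAC tag stands in for a real digital signature in this toy."""
        pseudonym: str
        claim: str      # a predicate such as "age_over_23", not the raw attribute
        issuer: str
        tag: bytes

    class Issuer:
        def __init__(self, name: str):
            self.name = name
            self._key = secrets.token_bytes(32)   # issuer's signing secret

        def issue(self, pseudonym: str, claim: str) -> Credential:
            msg = f"{pseudonym}|{claim}|{self.name}".encode()
            return Credential(pseudonym, claim, self.name,
                              hmac.new(self._key, msg, hashlib.sha256).digest())

        def verify(self, cred: Credential) -> bool:
            msg = f"{cred.pseudonym}|{cred.claim}|{cred.issuer}".encode()
            expected = hmac.new(self._key, msg, hashlib.sha256).digest()
            return hmac.compare_digest(expected, cred.tag)

    def rent_car(credentials, verifiers, required_claims):
        """The agency checks the predicates it needs, never the customer's identity."""
        held = {c.claim for c in credentials if verifiers[c.issuer].verify(c)}
        return required_claims <= held

    if __name__ == "__main__":
        state, dmv, insurer = Issuer("state"), Issuer("dmv"), Issuer("insurer")
        pseudonym = "renter-" + secrets.token_hex(8)   # not the customer's real name
        creds = [state.issue(pseudonym, "age_over_23"),
                 dmv.issue(pseudonym, "licensed_driver"),
                 insurer.issue(pseudonym, "insured")]
        verifiers = {"state": state, "dmv": dmv, "insurer": insurer}
        print(rent_car(creds, verifiers,
                       {"age_over_23", "licensed_driver", "insured"}))  # True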

Negotiation and enforcement of data handling conditions. Before ordering a product or service online, the user and the online service provider or merchant negotiate the type of personal data that is to be transferred to the service provider. This includes the conditions that shall apply to the handling of the personal data, such as whether or not it may be sent to third parties (profile selling) and under what conditions (e.g. only while informing the user), or at what time in the future it shall be deleted (if at all). After the transfer of personal data has taken place, the agreed-upon data handling conditions are technically enforced by the infrastructure of the service provider, which is capable of managing and processing data handling obligations. Moreover, this enforcement can be remotely audited by the user, for example by verifying chains of certification based on trusted computing modules or by verifying privacy seals/labels that were issued by third-party auditing organisations (e.g. data protection agencies). Thus, instead of having to rely on the mere promises of service providers not to abuse personal data, users can be more confident that the service provider adheres to the negotiated data handling conditions.

Lastly, the data transaction log allows users to log the personal data they send to service provider(s), the time at which they do it, and under what conditions. These logs are stored and allow users to determine what data they have sent to whom, or to establish the type of data that is in the possession of a specific service provider. This leads to more transparency, which is a prerequisite of being in control.
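A user-side data transaction log of the kind just described could, in its simplest form, look like the following Python sketch. The field names and methods are illustrative rather than taken from any existing system; the point is only that each transfer records what was sent, to whom, when, and under which agreed conditions, so that the user can later reconstruct who holds what.

    import json
    import time
    from dataclasses import asdict, dataclass, field

    @dataclass
    class DataTransfer:
        """One record of personal data leaving the user's hands."""
        recipient: str      # which service provider or merchant
        data_items: list    # what was transferred, e.g. ["email", "age"]
        conditions: dict    # agreed handling terms (retention, sharing, ...)
        timestamp: float = field(default_factory=time.time)

    class TransactionLog:
        """User-side log of past personal-data transfers."""
        def __init__(self):
            self._records = []

        def record(self, recipient, data_items, conditions):
            self._records.append(
                DataTransfer(recipient, list(data_items), dict(conditions)))

        def sent_to(self, recipient):
            """What data does this provider hold, according to my own records?"""
            return sorted({item for r in self._records
                           if r.recipient == recipient for item in r.data_items})

        def export(self):
            """Archive the log, e.g. for exercising inspection or deletion rights."""
            return json.dumps([asdict(r) for r in self._records], indent=2)

    if __name__ == "__main__":
        log = TransactionLog()
        log.record("car-rental.example", ["age_over_23", "payment_token"],
                   {"retention_days": 30, "third_party_sharing": False})
        print(log.sent_to("car-rental.example"))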




See also

  • Crypto-shredding
  • Cypherpunk
  • Digital credentials
  • Enhanced privacy ID (EPID)
  • I2P - The Anonymous Network
  • Identity management
  • Information privacy
  • Information processing
  • Information security
  • Privacy
  • Privacy by design
  • Privacy engineering
  • Privacy-enhanced Electronic Mail
  • Privacy software
  • Privacy policy



References

  • van Blarkom, G.W.; Borking, J.J.; Olk, J.G.E. (2003). "PET". Handbook of Privacy and Privacy-Enhancing Technologies: The Case of Intelligent Software Agents. ISBN 90-74087-33-7.



External links

PETs in general:

  • Stanford CIS wiki database of PETs
  • The EU PRIME research project (2004 to 2008) aiming at studying and developing novel PETs
  • About PETs from the Center for Democracy and Technology
  • Annual symposium on PETs
  • Report about PETs from the META Group, published by the Danish ministry of science
  • Activities of the EU Commission in the area of PETs

Anonymous credentials:

  • IBM Zürich Research Lab's idemix
  • Stefan Brands' U-Prove digital credentials (formerly Credentica), now owned by Microsoft

Privacy policy negotiation:

  • The W3C's P3P
  • IBM's EPAL
  • Sören Preibusch: Implementing Privacy Negotiations in E-Commerce, Discussion Papers of DIW Berlin 526, 2005

Source of the article: Wikipedia
