Research product, 2015

Conservative or liberal? Personalized differential privacy

Jorgensen, Zach; Yu, Ting; Cormode, Graham
Open Access English
  • Published: 01 Jun 2015
  • Country: United Kingdom
Abstract
Differential privacy is widely accepted as a powerful framework for providing strong, formal privacy guarantees for aggregate data analysis. A limitation of the model is that the same level of privacy protection is afforded for all individuals. However, it is common that the data subjects have quite different expectations regarding the acceptable level of privacy for their data. Consequently, differential privacy may lead to insufficient privacy protection for some users, while over-protecting others. We argue that by accepting that not all users require the same level of privacy, a higher level of utility can often be attained by not providing excess privacy to...
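The trade-off the abstract describes can be made concrete with the standard Laplace mechanism for ε-differential privacy: a count query is released with noise of scale 1/ε, so a single ε chosen to satisfy the most privacy-conscious user inflates noise for everyone. The sketch below shows that uniform "conservative" baseline; the function names are illustrative and this is not the paper's own mechanism, which the truncated abstract does not describe.

```python
import math
import random

def laplace_mechanism(true_count, epsilon, sensitivity=1.0):
    """Release a count under epsilon-differential privacy by adding
    Laplace noise with scale sensitivity/epsilon."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace(0, scale) distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

def conservative_release(true_count, user_epsilons):
    """Uniform-DP baseline: to honor every user's preference with one
    privacy level, calibrate noise to the strictest (smallest) epsilon,
    over-protecting everyone else -- the utility loss the paper argues
    personalized differential privacy can recover."""
    return laplace_mechanism(true_count, min(user_epsilons))
```

With preferences such as [0.1, 1.0, 1.0, 1.0], the baseline adds noise of scale 10 (driven by ε = 0.1) even though three of the four users would accept scale 1.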
Subjects
free text keywords: QA76
Related Organizations
Funded by
EC| DAPPER
Project
DAPPER
Doing Anonymization Practically, Privately, Effectively and Reusably
  • Funder: European Commission (EC)
  • Project Code: 618202
  • Funding stream: FP7 | SP3 | PEOPLE
NSF| CAREER: Trust and Privacy Management for Online Social Networks
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0747247
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computer and Network Systems
NSF| TWC SBE: Medium: Collaborative: User-Centric Risk Communication and Control on Mobile Devices
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 1314229
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computer and Network Systems