Big Data Privacy Preservation for Cyber-Physical Systems: Preliminaries


Publisher
Springer International Publishing
Copyright
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2019
ISBN
978-3-030-13369-6
Pages
11–20
DOI
10.1007/978-3-030-13370-2_2

Abstract

In this chapter, we briefly introduce the differential privacy technique and its variants, which effectively protect participants’ privacy in different CPS. The concept of differential privacy was first proposed by Dwork (Differential privacy: a survey of results. In: International conference on theory and applications of models of computation, Xi’an, 2008), and it specifies that any individual has only a very small influence on the (distribution of the) outcome of the computation. Differential privacy (DP) aims to exploit statistical information without disclosing the data providers’ privacy. Differential privacy is a formal definition of data privacy which ensures that any sequence of outputs from a data set (e.g., responses to queries) is “essentially” equally likely to occur, regardless of whether any particular individual is present or absent (Dwork et al., Found Trends Theor Comput Sci 9(3–4):211–407, 2014; Baranov et al., Am Econ J Microecon 9(3):1–27, 2017; Jin and Zhang, Privacy-preserving crowdsourced spectrum sensing. In: Proceedings of the IEEE international conference on computer communications (INFOCOM), pp. 1–9, 2016). In this chapter, we illustrate three variants of differential privacy: centralized differential privacy, distributed differential privacy, and local differential privacy, as applied to the various CPS applications described in this book.
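The guarantee described in the abstract is commonly formalized as ε-differential privacy: a randomized mechanism M satisfies ε-DP if, for any two data sets D and D′ differing in a single individual and any set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]. As a minimal sketch (not taken from this chapter; the function names are illustrative), the classical Laplace mechanism achieves ε-DP for a numeric query by adding noise scaled to the query’s sensitivity divided by ε:

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=random):
    """Release true_answer with noise calibrated for epsilon-DP.

    sensitivity: the maximum change in the query's answer when one
    individual is added to or removed from the data set.
    """
    scale = sensitivity / epsilon
    return true_answer + laplace_noise(scale, rng)

# Example: privately release a counting query (sensitivity 1).
rng = random.Random(0)
noisy_count = laplace_mechanism(100.0, sensitivity=1.0, epsilon=0.5, rng=rng)
```

A smaller ε yields a larger noise scale and hence stronger privacy, at the cost of a less accurate answer, which is the basic privacy–utility trade-off discussed throughout the book.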

Published: Mar 26, 2019
