Multi-Party Protocols, Information Complexity and Privacy

Publisher: Association for Computing Machinery
Copyright: © 2019 ACM
ISSN: 1942-3454
eISSN: 1942-3462
DOI: 10.1145/3313230

Abstract

We introduce a new information-theoretic measure, which we call Public Information Complexity (PIC), as a tool for the study of multi-party computation protocols and of quantities such as their communication complexity or the amount of randomness they require in the context of information-theoretic private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model, and we show a number of interesting properties and applications of our new notion: the Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and we give an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bitwise parity of the inputs. The latter result also establishes that the amount of randomness needed by a private protocol that computes this function is Ω(n).
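The claims above can be restated symbolically. The following is only an illustrative sketch of the abstract's statements: PIC, IC, and CC denote the Public Information Complexity, Information Complexity, and Communication Complexity of a function f, while the notation R(π) for the randomness used by a protocol π and Par_{k,n} for the k-party, n-bit Parity function is ours; the paper's precise definitions should be consulted.

% Illustrative restatement only; exact definitions and statements are in the paper.
\[
  \mathrm{IC}(f) \;\le\; \mathrm{PIC}(f) \;\le\; \mathrm{CC}(f),
  \qquad
  R(\pi) \;\ge\; \mathrm{PIC}(\pi) - \mathrm{IC}(\pi),
\]
\[
  \mathrm{Par}_{k,n}(x_1,\dots,x_k) \;=\; x_1 \oplus x_2 \oplus \cdots \oplus x_k,
  \qquad x_i \in \{0,1\}^n,
\]
and the randomness needed by any information-theoretically private protocol computing \(\mathrm{Par}_{k,n}\) is \(\Omega(n)\).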

Journal

ACM Transactions on Computation Theory (TOCT), Association for Computing Machinery

Published: Mar 17, 2019

Keywords: Multi-party communication complexity
