What Can We Learn Privately?

February 13, 2013 | Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, Adam Smith
The paper "What Can We Learn Privately?" by Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, and Adam Smith explores the capabilities of private learning algorithms, focusing on differential privacy. The authors investigate which concept classes can be learned privately, meaning that the output of the algorithm does not heavily depend on any single input or specific training example. They aim to understand the resources required for private learning in terms of samples, computation time, and interaction. Key contributions include: 1. **Private Version of Occam’s Razor**: A generic private learning algorithm that uses a sample size proportional to the logarithm of the concept class size. This is a private analogue of the cardinality version of Occam’s razor. 2. **Efficient Private Learner for Parity**: An efficient, distribution-free differentially private PAC learner for the class of parity functions, which is thought to be hard to learn with random classification noise. 3. **Equivalence of Local and SQ Learning**: The authors show that a concept class is learnable by a local differentially private algorithm if and only if it is learnable in the statistical query (SQ) model. This equivalence highlights the strong similarity between local learning and SQ learning. 4. **Separation of Interactive and Noninteractive Local Learning**: They construct a concept class, called masked-parity, that is efficiently learnable by interactive local algorithms but requires an exponential number of samples to be learned by noninteractive local algorithms. The paper also discusses the implications of these results, including the feasibility of private learning and the limitations of local algorithms. It concludes with a complexity-theoretic picture of learnable and privately learnable concept classes, showing that many concept classes can be learned privately with a polynomial number of samples and time.The paper "What Can We Learn Privately?" by Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, and Adam Smith explores the capabilities of private learning algorithms, focusing on differential privacy. The authors investigate which concept classes can be learned privately, meaning that the output of the algorithm does not heavily depend on any single input or specific training example. They aim to understand the resources required for private learning in terms of samples, computation time, and interaction. Key contributions include: 1. **Private Version of Occam’s Razor**: A generic private learning algorithm that uses a sample size proportional to the logarithm of the concept class size. This is a private analogue of the cardinality version of Occam’s razor. 2. **Efficient Private Learner for Parity**: An efficient, distribution-free differentially private PAC learner for the class of parity functions, which is thought to be hard to learn with random classification noise. 3. **Equivalence of Local and SQ Learning**: The authors show that a concept class is learnable by a local differentially private algorithm if and only if it is learnable in the statistical query (SQ) model. This equivalence highlights the strong similarity between local learning and SQ learning. 4. **Separation of Interactive and Noninteractive Local Learning**: They construct a concept class, called masked-parity, that is efficiently learnable by interactive local algorithms but requires an exponential number of samples to be learned by noninteractive local algorithms. 
The paper also discusses the implications of these results, including the feasibility of private learning and the limitations of local algorithms. It concludes with a complexity-theoretic picture of learnable and privately learnable concept classes, showing that many concept classes can be learned privately using a polynomial number of samples and polynomial running time.
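To make the connection between local differential privacy and the SQ model concrete, the sketch below shows how a single statistical query (the average of a 0/1 predicate over the data) can be answered by a noninteractive local protocol: each user passes their own bit through randomized response, and the aggregator debiases the mean of the noisy reports. This is a standard illustrative construction, not code from the paper; `answer_sq_query`, `predicate`, and `epsilon` are names chosen for the example.

```python
import math
import random


def randomized_response(bit, epsilon):
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it."""
    keep = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < keep else 1 - bit


def answer_sq_query(predicate, records, epsilon):
    """Estimate the population average of a 0/1 predicate in the local model.

    Each record is seen only by its owner, who releases a single randomized
    bit; the analyst never touches raw data, as required in the local model.
    """
    keep = math.exp(epsilon) / (math.exp(epsilon) + 1)
    reports = [randomized_response(int(predicate(z)), epsilon) for z in records]
    noisy_mean = sum(reports) / len(reports)
    # E[report] = (1 - keep) + (2 * keep - 1) * true_mean, so invert that map.
    return (noisy_mean - (1 - keep)) / (2 * keep - 1)


# Hypothetical usage: estimate the fraction of records whose first bit is 1.
# With only a handful of reports the estimate is very noisy; it concentrates
# around the true fraction as the number of users grows.
records = [(1, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
estimate = answer_sq_query(lambda z: z[0] == 1, records, epsilon=1.0)
```

This illustrates the easier direction of the equivalence: each query of an SQ algorithm can be answered this way given enough locally randomized reports. The paper also establishes the converse, showing that local algorithms can in turn be simulated in the SQ model.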