Hiding Identities: Estimation Under Local Differential Privacy
Deepesh Data, Antonious M. Girgis, Suhas Diggavi
In this paper, we study an estimation problem under the local differential privacy (LDP) framework: There is an ordered list of $d$ values (e.g., real numbers); a set of $n$ users, where each user observes an element from this list and each value in the list is observed by at least one user; and an untrusted server that wants to estimate the values the users possess, without learning (in the sense of LDP) the actual value that each user has or its corresponding index in the list. To this end, we propose two LDP estimation schemes: the first assumes that the server knows the number of users observing each value; the second handles the general scenario, in which the server does not have this prior information. We show that the minimax risk decreases with the total number of users under a very mild condition on the number of users observing each value.
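To make the LDP setting concrete, below is a minimal sketch (not the authors' scheme) of a standard $d$-ary randomized-response mechanism: each user privatizes the index of the value they observe, and the server debiases the resulting histogram to estimate how many users observe each value. All function names and parameters here are illustrative assumptions for this sketch.

```python
import numpy as np

def kary_randomized_response(index, d, epsilon, rng):
    """epsilon-LDP randomizer: report the true index with probability
    e^eps / (e^eps + d - 1), otherwise a uniformly random other index."""
    p_true = np.exp(epsilon) / (np.exp(epsilon) + d - 1)
    if rng.random() < p_true:
        return index
    other = rng.integers(d - 1)          # pick one of the d-1 wrong indices
    return other if other < index else other + 1

def estimate_counts(reports, d, epsilon):
    """Debias the histogram of private reports to estimate the number of
    users observing each of the d values."""
    n = len(reports)
    p = np.exp(epsilon) / (np.exp(epsilon) + d - 1)  # prob. of a truthful report
    q = 1.0 / (np.exp(epsilon) + d - 1)              # prob. of any fixed wrong index
    hist = np.bincount(reports, minlength=d)
    return (hist - n * q) / (p - q)

# Toy usage: 1000 users, d = 5 values, epsilon = 1.
rng = np.random.default_rng(0)
true_indices = rng.integers(5, size=1000)
reports = np.array([kary_randomized_response(i, 5, 1.0, rng) for i in true_indices])
print(estimate_counts(reports, 5, 1.0))
```

The debiasing uses that the expected histogram entry for index $j$ is $n_j p + (n - n_j) q$, so solving for $n_j$ gives an unbiased count estimate; the paper's schemes address the harder task of estimating the values themselves while hiding both values and indices.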
Session: Information Privacy II
Mechanisms for Hiding Sensitive Genotypes with Information-Theoretic Privacy
Linear and Range Counting under Metric-based Local Differential Privacy
Hiding Identities: Estimation Under Local Differential Privacy
A Better Bound Gives a Hundred Rounds: Enhanced Privacy Guarantees via f-Divergences
Privacy-Utility Tradeoff in a Guessing Framework Inspired by Index Coding