Much of the early literature assumes a trusted data curator, who collects data in the clear, aggregates it, and, as the final step, adds noise to the answer of the query (a.k.a. global differential privacy). However, such a trusted third party may not exist in most cases. Some recent work proposes a model called local differential privacy (LDP), where the noise is added locally by each user before the contributions are collected by the aggregator. Although LDP removes the need for a trusted curator, it introduces a tradeoff between privacy and utility: stronger privacy guarantees come at the cost of less accurate answers.
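The canonical LDP mechanism is randomized response, which illustrates this tradeoff concretely. The sketch below (an illustrative example, not from the text above) has each user report their true bit with probability e^ε/(e^ε+1) and flip it otherwise, which satisfies ε-LDP; the aggregator then debiases the noisy sum to estimate the true count.

```python
import math
import random

def randomized_response(bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1), flip otherwise.

    The ratio of the probabilities of any output under the two inputs is
    bounded by e^eps, so this satisfies epsilon-local differential privacy.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p else not bit

def estimate_count(reports, epsilon: float) -> float:
    """Unbiased estimate of the number of true bits from noisy reports."""
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    q = 1 - p  # probability of flipping
    # E[sum(reports)] = p * true_count + q * (n - true_count); invert it.
    return (sum(reports) - n * q) / (p - q)

# Illustration: 10,000 users, ~30% of whom hold the sensitive attribute.
random.seed(0)
true_bits = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
estimate = estimate_count(reports, epsilon=1.0)
```

Note how utility degrades as ε shrinks: a smaller ε pushes the report probability toward 1/2, which inflates the variance of the debiased estimate even though it is unbiased at every ε.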