Differential Privacy

Definition

A mathematical privacy standard guaranteeing that adding or removing any single record changes the probability of any output of a computation only by a bounded, quantified factor (parameterised by epsilon, ε). It is implemented by adding controlled noise to data, intermediate results, or model updates.
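As a sketch of the "controlled noise" mentioned above: the classic Laplace mechanism adds noise scaled to the query's sensitivity divided by epsilon. The function below is a minimal illustration, not a vetted implementation; a real deployment would use an audited library (e.g. OpenDP or Google's differential privacy library).

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return true_value plus Laplace noise with scale sensitivity / epsilon.

    For a count query the sensitivity is 1: adding or removing a single
    record changes the exact answer by at most 1.
    """
    scale = sensitivity / epsilon
    # Inverse-transform sampling of Laplace(0, scale)
    u = random.random() - 0.5
    return true_value - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# A private count of 1,000 records at two epsilon values:
print(laplace_mechanism(1000, 1, 0.1))   # heavily noised (typical error ~10)
print(laplace_mechanism(1000, 1, 10.0))  # nearly exact (typical error ~0.1)
```

Note that the noise scale depends only on sensitivity and epsilon, not on the size of the dataset, which is why large aggregates tolerate noise well while small subgroups do not.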

Noise — Signal

Differential privacy is often cited as "the gold standard for data protection" — correctly, but rarely with an understanding of what the guarantee actually says. The protection is parameterised: a high epsilon gives little protection and high data utility; a low epsilon gives strong protection but heavily noised results. In practice, the epsilon values used in vendor products are often so large that the protection against realistic adversaries is marginal. DP also does not defend against every threat: the guarantee is per record and degrades under repeated queries (composition), and for correlated records the effective protection is weaker than typically communicated.
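The trade-off between epsilon and utility can be made concrete: for the Laplace mechanism, the expected absolute error of a single query equals its noise scale, sensitivity divided by epsilon. A back-of-the-envelope check, assuming a count query with sensitivity 1:

```python
# Expected absolute error of the Laplace mechanism equals its scale
# parameter b = sensitivity / epsilon (for Laplace(0, b), E|X| = b).
sensitivity = 1  # count query: one record changes the answer by at most 1

for epsilon in (0.1, 1.0, 10.0):
    error = sensitivity / epsilon
    print(f"epsilon = {epsilon:>4}: expected absolute error ~ {error:g}")
```

At ε = 0.1 a count is off by about 10 on average, which is negligible for a population of millions but ruinous for a subgroup of twenty; this is why the parameter cannot be judged without the threat model and the query workload.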

The right question

Not: "Are we using differential privacy?" But: "At what epsilon, against which threat model, with what measured impact on data utility — and which additional measures complement DP, because on its own it isn't enough?"