# BoltOn Subpackage

This package contains source code for the BoltOn method, a particular
differential-privacy (DP) technique that uses output perturbation and
leverages additional assumptions to provide a new approach to
privacy guarantees.

## BoltOn Description

This method uses four key steps to achieve its privacy guarantees:

1. Adds noise to weights after training (output perturbation).
2. Projects weights to R, the radius of the hypothesis space,
   after each batch. This value is configurable by the user.
3. Limits the learning rate.
4. Uses a strongly convex loss function (see compile).

For more details on the strong convexity requirements, see
"Bolt-on Differential Privacy for Scalable Stochastic Gradient
Descent-based Analytics" by Xi Wu et al. at
https://arxiv.org/pdf/1606.04722.pdf
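
The projection and perturbation steps above can be sketched in plain NumPy. This is only an illustration of the mechanism, not this package's API; the radius and noise scale here are arbitrary example values (the paper calibrates the noise scale from epsilon, the strong-convexity constant, and the dataset size):

```python
import numpy as np

def project_to_ball(weights, radius):
    """Step 2: project the weights back into the L2 ball of the given radius."""
    norm = np.linalg.norm(weights)
    return weights * (radius / norm) if norm > radius else weights

def output_perturbation(weights, scale, rng):
    """Step 1: add Laplace noise to the trained weights (output perturbation)."""
    return weights + rng.laplace(loc=0.0, scale=scale, size=weights.shape)

rng = np.random.default_rng(0)
w = np.array([3.0, 4.0])            # ||w|| = 5
w = project_to_ball(w, radius=1.0)  # after projection, ||w|| = 1
w_private = output_perturbation(w, scale=0.1, rng=rng)
```

In the actual training loop the projection runs after every batch (step 2), while the noise is added only once, after training finishes (step 1).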

## Why BoltOn?

The major difference in the BoltOn method is that it injects noise after model
convergence, rather than noising gradients or weights during training. This
approach requires the additional constraints listed in the Description.
Should the use case and model satisfy these constraints, this is another
approach that can be trained to maximize utility while maintaining privacy.
The paper describes in detail the advantages and disadvantages of this approach
and its results compared to some other methods, namely noising at each iteration
and no noising.
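
To make the distinction concrete, the two noising schedules can be sketched for a simple least-squares model. This is a minimal NumPy illustration, not this package's API; the loss, step size, and noise scale are assumed example values with no privacy calibration:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 2))
true_w = np.array([1.0, -2.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def train(noise_during=0.0, noise_after=0.0, steps=200, lr=0.01):
    """Gradient descent on least squares, with either noising schedule."""
    w = np.zeros(2)
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        if noise_during:
            # Per-iteration noising: perturb every gradient step.
            grad = grad + rng.laplace(scale=noise_during, size=2)
        w = w - lr * grad
    if noise_after:
        # BoltOn-style output perturbation: noise only the converged weights.
        w = w + rng.laplace(scale=noise_after, size=2)
    return w

w_iterative = train(noise_during=0.05)  # noise at every step
w_bolton = train(noise_after=0.05)      # noise once, after convergence
```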

## Tutorials

This package has a tutorial that can be found in the root tutorials directory
under `bolton_tutorial.py`.

## Contribution

This package was initially contributed by Georgian Partners with the hope of
growing the tensorflow/privacy library. There are several rich use cases for
(epsilon, delta)-differential privacy in machine learning, some of which can
be explored here:
https://medium.com/apache-mxnet/epsilon-differential-privacy-for-machine-learning-using-mxnet-a4270fe3865e
https://arxiv.org/pdf/1811.04911.pdf

## Stability

As we are pinned to tensorflow 2.0.0, this package may encounter stability
issues during its ongoing development.

We are aware of issues in model fitting using the BoltOnModel and are actively
working to resolve them.

## Contacts

In addition to the maintainers of tensorflow/privacy listed in the root
README.md, please feel free to contact members of Georgian Partners. In
particular,

* Georgian Partners (@georgianpartners)
* Ji Chao Zhang (@Jichaogp)
* Christopher Choquette (@cchoquette)

## Copyright

Copyright 2019 - Google LLC