forked from 626_privacy/tensorflow_privacy

commit fe90e3c596 (parent d0ef1b380c): readme fixes - more
1 changed file with 6 additions and 6 deletions

[...] of methods used in ensuring privacy in machine learning that leverages
additional assumptions to provide a new way of approaching privacy
guarantees.

## Bolton Description

This method uses 4 key steps to achieve privacy guarantees:
1. Adds noise to weights after training (output perturbation; see the sketch
   below).

[...]
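
To make step 1 concrete, here is a minimal NumPy sketch of output perturbation on a strongly convex objective. Everything in it (the toy logistic model, the `sensitivity` value, the generic Laplace noise) is an illustrative assumption rather than the Bolton implementation, which calibrates its noise as described in the Xi Wu et al. paper cited below:

```python
import numpy as np

def train_to_convergence(x, y, l2_reg=0.1, lr=0.1, epochs=200):
    """Plain gradient descent on L2-regularized logistic loss.

    The L2 term is what makes the objective strongly convex, which the
    privacy analysis relies on.
    """
    w = np.zeros(x.shape[1])
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-(x @ w)))          # sigmoid
        grad = x.T @ (preds - y) / len(y) + l2_reg * w  # mean loss gradient
        w -= lr * grad
    return w

def output_perturbation(w, epsilon, sensitivity):
    """Privatize trained weights once, after convergence.

    Generic Laplace noise with scale sensitivity / epsilon; a stand-in
    for the paper's calibrated noise distribution.
    """
    return w + np.random.laplace(scale=sensitivity / epsilon, size=w.shape)

# Toy usage: train non-privately, then release only the noised weights.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
y = (x @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
w_private = output_perturbation(train_to_convergence(x, y),
                                epsilon=1.0, sensitivity=0.5)
```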

For more details on the strong convexity requirements, see
*Bolt-on Differential Privacy for Scalable Stochastic Gradient
Descent-based Analytics* by Xi Wu et al.

## Why Bolton?

The major difference of the Bolton method is that it injects noise after
model convergence, rather than noising gradients or weights during training.
This [...]

The paper describes in detail the advantages and disadvantages of this
approach and its results compared to some other methods, namely noising at
each iteration and no noising.
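
To make that contrast concrete, here is a toy NumPy sketch of the two noise-injection points (plain, un-noised gradient descent being the third, non-private baseline). The step counts and noise scales are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

def noise_each_iteration(w, grad_fn, lr=0.1, steps=100, scale=0.01):
    """Noising during training: perturb every gradient step."""
    for _ in range(steps):
        w = w - lr * (grad_fn(w) + np.random.normal(scale=scale, size=w.shape))
    return w

def noise_after_convergence(w, grad_fn, lr=0.1, steps=100, scale=0.01):
    """Bolton-style: run clean gradient descent, then perturb the
    converged weights a single time."""
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w + np.random.laplace(scale=scale, size=w.shape)

# Toy objective ||w - 1||^2, with gradient 2 * (w - 1).
grad = lambda w: 2.0 * (w - 1.0)
w0 = np.zeros(2)
print(noise_each_iteration(w0, grad))     # noise compounds across steps
print(noise_after_convergence(w0, grad))  # one post-hoc perturbation
```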

## Tutorials

This package has a tutorial that can be found in the root tutorials directory,
under `bolton_tutorial.py`.
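
As a rough preview, the tutorial follows the usual Keras compile/fit flow. The import paths, class names, and keyword arguments below are recalled assumptions, not a verified API surface, so treat `bolton_tutorial.py` as authoritative:

```python
import numpy as np
import tensorflow as tf

# Assumed module paths and names -- verify against bolton_tutorial.py.
from privacy.bolton import losses, models

# Toy binary-classification data, for illustration only.
x = np.random.normal(size=(100, 4)).astype(np.float32)
y = np.random.randint(0, 2, size=(100, 1)).astype(np.float32)

bolt = models.BoltonModel(n_outputs=1)  # assumed constructor signature
bolt.compile(
    optimizer=tf.optimizers.SGD(),
    # A strongly convex loss is required; the three constants
    # (reg_lambda, C, radius_constant -- assumed order) are placeholders.
    loss=losses.StrongConvexBinaryCrossentropy(1.0, 1.0, 1.0),
)
# epsilon and noise_distribution are assumed fit-time parameters.
bolt.fit(x, y, epsilon=1.0, noise_distribution='laplace', epochs=2)
```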

## Contribution

This package was initially contributed by Georgian Partners with the hope of
growing the tensorflow/privacy library. There are several rich use cases for
epsilon-delta privacy in machine learning, some of which can be explored here:

* https://medium.com/apache-mxnet/epsilon-differential-privacy-for-machine-learning-using-mxnet-a4270fe3865e
* https://arxiv.org/pdf/1811.04911.pdf

## Contacts

In addition to the maintainers of tensorflow/privacy listed in the root
README.md, please feel free to contact members of Georgian Partners. In
particular,

* Ji Chao Zhang (@Jichaogp)
* Christopher Choquette (@cchoquette)

## Copyright

Copyright 2019 - Google LLC