PiperOrigin-RevId: 239031260
Nicolas Papernot 2019-03-18 11:54:41 -07:00 committed by A. Unique TensorFlower
parent a9840529c4
commit f58891f3e3
2 changed files with 22 additions and 3 deletions


@@ -67,7 +67,8 @@ Github pull requests. To speed the code review process, we ask that:
To help you get started with the functionalities provided by this library, the
`tutorials/` folder comes with scripts demonstrating how to use the library
features. The list of tutorials is described in the README included in the
tutorials directory.
NOTE: the tutorials are maintained carefully. However, they are not considered
part of the API and they can change at any time without warning. You should not


@@ -1,10 +1,28 @@
# Tutorials
This folder contains a set of tutorials that demonstrate the features of this
library.
As demonstrated on MNIST in `mnist_dpsgd_tutorial.py`, the easiest way to use
a differentially private optimizer is to modify an existing TF training loop
to replace an existing vanilla optimizer with its differentially private
counterpart implemented in the library.
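To make the swap concrete, here is a minimal pure-Python sketch of the per-example gradient processing that a differentially private optimizer performs internally: clip each example's gradient to a fixed L2 norm, sum, add Gaussian noise, and average. The function name and signature are illustrative only, not the library's API; the real optimizers in this library wrap this logic inside a TensorFlow optimizer class.

```python
import math
import random

def dp_average_gradient(per_example_grads, l2_norm_clip, noise_multiplier,
                        rng=random):
    """Illustrative DP-SGD aggregation step (not the library's API).

    Clips each per-example gradient to l2_norm_clip, sums the clipped
    gradients, adds Gaussian noise with stddev noise_multiplier *
    l2_norm_clip, and averages over the batch.
    """
    n = len(per_example_grads)
    dim = len(per_example_grads[0])
    summed = [0.0] * dim
    for g in per_example_grads:
        # Scale the gradient down if its L2 norm exceeds the clipping bound.
        norm = math.sqrt(sum(x * x for x in g))
        scale = min(1.0, l2_norm_clip / norm) if norm > 0 else 1.0
        for i in range(dim):
            summed[i] += g[i] * scale
    # Noise calibrated to the clipping norm, as in DP-SGD.
    sigma = noise_multiplier * l2_norm_clip
    return [(summed[i] + rng.gauss(0.0, sigma)) / n for i in range(dim)]
```

For example, with `noise_multiplier=0.0` and `l2_norm_clip=1.0`, a per-example gradient of `[3.0, 4.0]` (norm 5) is scaled to `[0.6, 0.8]` before averaging. A vanilla optimizer would skip the clipping and noising and simply average the raw gradients.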
Here is a list of all the tutorials included:
* `lm_dpsgd_tutorial.py`: learn a language model with differential privacy.
* `mnist_dpsgd_tutorial.py`: learn a convolutional neural network on MNIST with
differential privacy.
* `mnist_dpsgd_tutorial_eager.py`: learn a convolutional neural network on MNIST
with differential privacy using Eager mode.
* `mnist_dpsgd_tutorial_keras.py`: learn a convolutional neural network on MNIST
with differential privacy using tf.Keras.
The rest of this README describes the different parameters used to configure
DP-SGD as well as expected outputs for the `mnist_dpsgd_tutorial.py` tutorial.
## Parameters
All of the optimizers share some privacy-specific parameters that need to
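As a sketch of how these shared parameters relate to each other, the snippet below uses illustrative values (not recommended defaults; the actual values live in the tutorial scripts) and shows the relationship between the clipping norm, the noise multiplier, and the standard deviation of the noise added to the summed gradients:

```python
# Illustrative parameter values only; see the tutorial scripts for
# the values actually used in the experiments.
l2_norm_clip = 1.0      # maximum L2 norm of each clipped gradient
noise_multiplier = 1.1  # ratio of noise stddev to the clipping norm
num_microbatches = 256  # microbatches the batch is split into for clipping

# The Gaussian noise added to the gradient sum is calibrated to the
# clipping norm: stddev = noise_multiplier * l2_norm_clip.
noise_stddev = noise_multiplier * l2_norm_clip
```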