diff --git a/README.md b/README.md
index 0bd4dbd..f362ecd 100644
--- a/README.md
+++ b/README.md
@@ -67,13 +67,14 @@ Github pull requests. To speed the code review process, we ask that:
 
 To help you get started with the functionalities provided by this library, the
 `tutorials/` folder comes with scripts demonstrating how to use the library
-features.
+features. The list of tutorials is described in the README included in the
+`tutorials/` directory.
 
 NOTE: the tutorials are maintained carefully. However, they are not considered
 part of the API and they can change at any time without warning. You should not
 write 3rd party code that imports the tutorials and expect that the interface
 will not break.
-
+
 ## Research directory
 
 This folder contains code to reproduce results from research papers related to
diff --git a/tutorials/README.md b/tutorials/README.md
index ce2809f..94b5cef 100644
--- a/tutorials/README.md
+++ b/tutorials/README.md
@@ -1,10 +1,28 @@
 # Tutorials
 
+This folder contains a set of tutorials that demonstrate the features of this
+library.
+
 As demonstrated on MNIST in `mnist_dpsgd_tutorial.py`, the easiest way to use
-a differentially private optimizer is to modify an existing training loop
+a differentially private optimizer is to modify an existing TF training loop
 to replace an existing vanilla optimizer with its differentially private
 counterpart implemented in the library.
 
+Here is a list of all the tutorials included:
+
+* `lm_dpsgd_tutorial.py`: train a language model with differential privacy.
+
+* `mnist_dpsgd_tutorial.py`: train a convolutional neural network on MNIST with
+  differential privacy.
+
+* `mnist_dpsgd_tutorial_eager.py`: train a convolutional neural network on MNIST
+  with differential privacy using Eager mode.
+
+* `mnist_dpsgd_tutorial_keras.py`: train a convolutional neural network on MNIST
+  with differential privacy using `tf.keras`.
+
+The rest of this README describes the different parameters used to configure
+DP-SGD as well as expected outputs for the `mnist_dpsgd_tutorial.py` tutorial.
+
 ## Parameters
 
 All of the optimizers share some privacy-specific parameters that need to
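To make the swap the new `tutorials/README.md` text describes concrete, here is a minimal sketch of what it looks like in a TF v1 training loop. It assumes the `privacy.optimizers.dp_optimizer` module and the `DPGradientDescentGaussianOptimizer` class shipped with the library at the time of this change; the toy model, placeholder shapes, and hyperparameter values are illustrative assumptions, not taken from `mnist_dpsgd_tutorial.py`.

```python
import tensorflow as tf

from privacy.optimizers import dp_optimizer

# Toy stand-in for a real model (illustrative only).
batch_size = 256
features = tf.placeholder(tf.float32, [batch_size, 784])
labels = tf.placeholder(tf.int32, [batch_size])
logits = tf.layers.dense(features, 10)

# DP optimizers need one loss value per example (a vector loss) so that each
# example's gradient can be clipped before noise is added; do not average yet.
vector_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)

# Before: a vanilla optimizer.
#   optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.15)
# After: its differentially private counterpart, configured with the
# privacy-specific parameters the "Parameters" section goes on to describe.
optimizer = dp_optimizer.DPGradientDescentGaussianOptimizer(
    l2_norm_clip=1.0,             # clipping norm for each microbatch gradient
    noise_multiplier=1.1,         # ratio of noise stddev to l2_norm_clip
    num_microbatches=batch_size,  # must evenly divide the batch size
    learning_rate=0.15)

train_op = optimizer.minimize(
    loss=vector_loss, global_step=tf.train.get_or_create_global_step())
```

The rest of the training loop (session setup, feeds, evaluation) is unchanged, which is the point of the replace-the-vanilla-optimizer approach: the privacy machinery lives entirely inside the optimizer.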