From bf9a58d26bbc71143aef9c874d3ccdc1533e5542 Mon Sep 17 00:00:00 2001
From: Nicholas Carlini
Date: Tue, 14 Dec 2021 00:59:37 +0000
Subject: [PATCH] Add citation

---
 research/instahide_attack_2020/README.md | 17 ++++++++++++++++-
 1 file changed, 16 insertions(+), 1 deletion(-)

diff --git a/research/instahide_attack_2020/README.md b/research/instahide_attack_2020/README.md
index 50f729a..3d782ff 100644
--- a/research/instahide_attack_2020/README.md
+++ b/research/instahide_attack_2020/README.md
@@ -1,6 +1,6 @@
 Implementation of our reconstruction attack on InstaHide.
 
-An Attack on InstaHide: Is Private Learning Possible with Instance Encoding?
+Is Private Learning Possible with Instance Encoding?
 Nicholas Carlini, Samuel Deng, Sanjam Garg, Somesh Jha, Saeed Mahloujifar, Mohammad Mahmoody, Shuang Song, Abhradeep Thakurta, Florian Tramer
 https://arxiv.org/abs/2011.05315
 
@@ -49,3 +49,18 @@ To reproduce our results and run the attack, each of the files should be run in
 
 6. Run `step_6_adjust_color.py`. Adjust the color curves to match.
 7. Run `step_7_visualize.py`. Show the final resulting images.
+
+## Citation
+
+You can cite this attack as:
+
+```
+@inproceedings{carlini2021private,
+  title={Is Private Learning Possible with Instance Encoding?},
+  author={Carlini, Nicholas and Deng, Samuel and Garg, Sanjam and Jha, Somesh and Mahloujifar, Saeed and Mahmoody, Mohammad and Thakurta, Abhradeep and Tram{\`e}r, Florian},
+  booktitle={2021 IEEE Symposium on Security and Privacy (SP)},
+  pages={410--427},
+  year={2021},
+  organization={IEEE}
+}
+```
\ No newline at end of file