# Inverting Gradients - How easy is it to break Privacy in Federated Learning?
---------------------
**Update Feb 2022: A modernized implementation of this attack (and many other attacks) is included in our newest framework for privacy attacks in FL:**
**https://github.com/JonasGeiping/breaching**
---------------------
This repository is an implementation of the reconstruction algorithm discussed in
```
Jonas Geiping, Hartmut Bauermeister, Hannah Dröge, and Michael Moeller.
Inverting Gradients -- How Easy Is It to Break Privacy in Federated Learning?,
March 31, 2020.
https://arxiv.org/abs/2003.14053v1.
```
which can be found at https://arxiv.org/abs/2003.14053
Input Image | Reconstruction from gradient information
:-------------------------:|:-------------------------:
![](11794_ResNet18_ImageNet_input.png) | ![](11794_ResNet18_ImageNet_output.png)
[Model: standard ResNet18, trained on ImageNet data. The image is from the validation set.]
### Abstract:
The idea of federated learning is to collaboratively train a neural network on a server. Each user receives the current weights of the network and in turn sends parameter updates (gradients) based on local data. This protocol has been designed not only to train neural networks data-efficiently, but also to provide privacy benefits for users, as their input data remains on device and only parameter gradients are shared. In this paper we show that sharing parameter gradients is by no means secure: By exploiting a cosine similarity loss along with optimization methods from adversarial attacks, we are able to faithfully reconstruct images at high resolution from the knowledge of their parameter gradients, and demonstrate that such a break of privacy is possible even for trained deep networks. Moreover, we analyze the effects of architecture as well as parameters on the difficulty of reconstructing the input image, prove that any input to a fully connected layer can be reconstructed analytically independent of the remaining architecture, and show numerically that even averaging gradients over several iterations or several images does not protect the user's privacy in federated learning applications in computer vision.
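The attack minimizes a cosine-similarity objective between the observed gradient and the gradient induced by a candidate input. As a minimal, standard-library sketch (the function name and flat-vector representation are illustrative, not the repository's API), the objective per parameter vector looks like:

```python
import math

def cosine_reconstruction_loss(g_candidate, g_observed):
    """1 - cosine similarity between the gradient of a candidate input
    and the gradient observed from the user; the attack minimizes this
    over the candidate image (plus a total-variation regularizer)."""
    dot = sum(a * b for a, b in zip(g_candidate, g_observed))
    norm = (math.sqrt(sum(a * a for a in g_candidate))
            * math.sqrt(sum(b * b for b in g_observed)))
    return 1.0 - dot / norm
```

A perfectly matching gradient direction gives a loss of 0 regardless of magnitude, which is why the cosine objective recovers the image even when gradient scale is uninformative.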
## Code
The central file that contains the reconstruction algorithm can be found at ```inversefed/reconstruction_algorithms.py```. The other folders and files are used to define and train the various models and are not central for recovery.
### Setup:
Requirements:
```
pytorch=1.4.0
torchvision=0.5.0
```
You can use [anaconda](https://www.anaconda.com/distribution/) to install our setup by running
```
conda env create -f environment.yml
conda activate iv
```
To run ImageNet experiments, you need to download ImageNet and provide its location [or use your own images and skip the ```inversefed.construct_dataloaders``` steps].
### Quick Start
Usage examples can be found in the notebooks, for example the [ResNet-152, ImageNet](ResNet152%20-%20trained%20on%20ImageNet.ipynb) example.
Given an input gradient (as computed by, e.g., ```torch.autograd.grad```), a ```config``` dictionary, a model ```model```, and the dataset mean and std ```(dm, ds)```, build the reconstruction operator
```
rec_machine = inversefed.GradientReconstructor(model, (dm, ds), config, num_images=1)
```
and then start the reconstruction, specifying a target image size:
```
output, stats = rec_machine.reconstruct(input_gradient, None, img_shape=(3, 32, 32))
```
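For context, the ```input_gradient``` fed into the reconstructor is simply the per-parameter gradient of the user's loss. A minimal sketch with a toy stand-in model (the architecture and shapes here are illustrative; the commented lines show where the repository's reconstructor would take over):

```python
import torch
import torch.nn as nn

# Hypothetical victim setup: a tiny CNN and one CIFAR-shaped image.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.Flatten(),
    nn.Linear(8 * 32 * 32, 10),
)
model.eval()

x = torch.randn(1, 3, 32, 32)  # stands in for the user's private image
y = torch.tensor([3])          # its label

# The gradient a federated-learning client would share:
loss = nn.CrossEntropyLoss()(model(x), y)
input_gradient = [g.detach() for g in
                  torch.autograd.grad(loss, model.parameters())]

# With this repository installed, that list is what the attack consumes:
# rec_machine = inversefed.GradientReconstructor(model, (dm, ds), config, num_images=1)
# output, stats = rec_machine.reconstruct(input_gradient, None, img_shape=(3, 32, 32))
```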
### CLI Usage example:
The code can also be used from the command line, for example:
```
python reconstruct_image.py --model ResNet20-4 --dataset CIFAR10 --trained_model --cost_fn sim --indices def --restarts 32 --save_image --target_id -1
```