# Using nn.Dropout to Randomly Drop Neurons

Both PyTorch and TensorFlow provide a dropout layer. This post walks through how PyTorch's `torch.nn.Dropout` is used and what it does to its input.

# 1. torch.nn.Dropout

The official PyTorch docstring describes it as follows:

> During training, randomly zeroes some of the elements of the input tensor with probability `p` using samples from a Bernoulli distribution. Each channel will be zeroed out independently on every forward call.
>
> This has proven to be an effective technique for regularization and preventing the co-adaptation of neurons as described in the paper *Improving neural networks by preventing co-adaptation of feature detectors*.
>
> Furthermore, the outputs are scaled by a factor of 1/(1−p) during training. This means that during evaluation the module simply computes an identity function.
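The last point is worth demonstrating directly: a `Dropout` module behaves differently depending on whether it is in training or evaluation mode. A minimal sketch (variable names are my own):

```python
import torch

torch.manual_seed(0)
drop = torch.nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()    # training mode: elements are zeroed, survivors scaled by 1/(1-p)
print(drop(x))  # surviving entries equal 1 / (1 - 0.5) = 2.0

drop.eval()     # evaluation mode: dropout is the identity function
print(drop(x))  # the input comes back unchanged
```

In a full model you rarely toggle this by hand; calling `model.train()` or `model.eval()` on the parent module switches all of its `Dropout` layers at once.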

## 1.1 Dropout on a one-dimensional tensor

```python
import torch

a = torch.randn(10)
p = 0.3  # probability of an element to be zeroed
dropout = torch.nn.Dropout(p)
a1 = dropout(a)
print(a)
print(a1)
```

Comparing the two printed tensors, we can see:

1. The role of `p`: it is the probability that each element is zeroed.
2. Some of the 10 elements are set to zero; with p = 0.3, on average about 3 of them will be (the exact count varies per run, since each element is zeroed independently).
3. The surviving elements are rescaled: output = input / (1 − p).
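The scaling rule in point 3 can be checked numerically. A small sketch that verifies every non-zero output element equals the corresponding input divided by (1 − p):

```python
import torch

torch.manual_seed(0)
p = 0.3
x = torch.randn(10)
y = torch.nn.Dropout(p)(x)

# every surviving element is the input divided by (1 - p)
mask = y != 0
assert torch.allclose(y[mask], x[mask] / (1 - p))
print((y == 0).sum().item(), "of 10 elements were zeroed")
```

This inverse scaling keeps the expected value of each activation the same in training and evaluation, which is why the module can simply become the identity at eval time.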

## 1.2 Dropout on a convolutional feature map

```python
import torch

a = torch.randn([2, 2, 3, 4])  # (batch, channels, height, width)
p = 0.3  # probability of an element to be zeroed
dropout = torch.nn.Dropout(p)
a1 = dropout(a)
print(a)
print(a1)
```

If you are interested, run a few more groups of tests: you will see that `nn.Dropout` zeroes each element independently, with the zeros scattered across the batch and channel dimensions rather than aligned per channel.
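For feature maps, PyTorch also offers `torch.nn.Dropout2d`, which zeroes entire channels at once rather than individual elements (this matches the "each channel will be zeroed out" phrasing in the docstring). A sketch contrasting the two (the tensor of ones makes the pattern easy to see):

```python
import torch

torch.manual_seed(0)
x = torch.ones(2, 2, 3, 4)  # (batch, channels, H, W)

# nn.Dropout zeroes individual elements, ignoring channel boundaries
elem = torch.nn.Dropout(p=0.5)(x)

# nn.Dropout2d zeroes entire channels: all H*W values of a channel together
chan = torch.nn.Dropout2d(p=0.5)(x)

for b in range(2):
    for c in range(2):
        ch = chan[b, c]
        # each channel is either all zeros or all scaled to 1/(1-0.5) = 2.0
        assert bool((ch == 0).all()) or bool((ch == 2.0).all())
print(elem)
print(chan)
```

Use `Dropout2d` when neighboring pixels within a feature map are strongly correlated, since element-wise dropout provides little regularization in that case.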
