Dropout
torch.nn.Dropout(p=0.5, inplace=False)
- p – probability of an element to be zeroed. Default: 0.5
- inplace – If set to True, will do this operation in-place. Default: False
During training, elements of the input are randomly zeroed with probability p, where p is the probability of being zeroed; for example, p=1 means every element is set to 0.
During training, randomly zeroes some of the elements of the input tensor with probability p, using samples from a Bernoulli distribution. Each channel will be zeroed out independently on every forward call.
Note: the PyTorch documentation also points out that the outputs are scaled by a factor of 1/(1−p).
Furthermore, the outputs are scaled by a factor of 1/(1−p) during training. This means that during evaluation the module simply computes an identity function.
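This documented behavior can be reproduced by hand; a minimal sketch (the Bernoulli-mask formulation follows the docs, but the helper name `manual_dropout` is ours, not part of PyTorch):

```python
import torch

def manual_dropout(x, p=0.5, training=True):
    # Sketch of dropout's documented behavior: Bernoulli mask + 1/(1-p) scaling.
    if not training or p == 0.0:
        return x  # evaluation: identity function
    # Each element is kept independently with probability 1-p
    mask = torch.bernoulli(torch.full_like(x, 1.0 - p))
    # Scale survivors so the expected output equals the input
    return x * mask / (1.0 - p)

x = torch.ones(2, 3)
out = manual_dropout(x, p=0.5)
# Every entry is either 0 (dropped) or 2.0 (kept and scaled by 1/(1-0.5))
```

Because survivors are scaled by 1/(1−p), the expected value of each output element equals the input, so no rescaling is needed at evaluation time.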
The following example shows that after dropout, the surviving elements become 1/(1−p) times their original values (2× here, since p=0.5):
```python
import torch
import torch.nn as nn

# Float tensor: nn.Dropout expects floating-point input
input = torch.tensor([[1., 2., 3.],
                      [4., 5., 6.]])
m = nn.Dropout(p=0.5)
output = m(input)
print(output)  # surviving elements are doubled (1/(1-0.5) = 2)
```
When we set nn.Dropout's inplace=True, the computed result replaces the original input, as follows:
```python
import torch
import torch.nn as nn

input = torch.tensor([[1., 2., 3.],
                      [4., 5., 6.]])
m = nn.Dropout(p=0.5, inplace=True)
m(input)
print(input)  # input itself now holds the dropped-out, rescaled values
```
Different behavior in training and evaluation
nn.Dropout behaves differently in training and evaluation: during training it randomly drops neurons with probability p, but during evaluation no neurons are dropped, as shown below:
```python
import torch
import torch.nn as nn

m = nn.Dropout(p=0.5)
input = torch.ones(1, 10)

m.train()          # training mode: elements dropped with probability p
print(m(input))

m.eval()           # evaluation mode: dropout is disabled (identity)
print(m(input))    # identical to input
```