📅  Last modified: 2023-12-03 15:17:53.155000             🧑  Author: Mango
The nn.Dropout module in PyTorch randomly zeroes elements of its input tensor with a specified probability during training. This helps prevent overfitting and improves the generalization performance of deep learning models.
torch.nn.Dropout(p: float = 0.5, inplace: bool = False)
p: probability of an element being zeroed. Default: 0.5.
inplace: if set to True, performs the operation in-place. Default: False.

Example usage of nn.Dropout:
import torch
import torch.nn as nn
# Define a simple feed-forward model
model = nn.Sequential(
    nn.Linear(20, 512),
    nn.Dropout(0.2),        # randomly zero 20% of the elements
    nn.ReLU(inplace=True),
    nn.Linear(512, 10),
)
# Forward pass
x = torch.randn(1, 20)
output = model(x)
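One detail worth noting: dropout is only active while the module is in training mode; calling .eval() (as you would before inference) turns it into a no-op. A minimal sketch of this behavior:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()               # training mode: elements are randomly zeroed
y_train = drop(x)

drop.eval()                # evaluation mode: dropout does nothing
y_eval = drop(x)

# In eval mode the input passes through unchanged.
assert torch.equal(y_eval, x)
```

This is why forgetting model.eval() before validation or inference produces noisy, randomly varying predictions.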
Using nn.Dropout during training acts as a regularizer: it helps prevent overfitting and improves generalization to unseen data. Overall, nn.Dropout is a simple and effective tool for improving the performance and accuracy of deep learning models.
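PyTorch implements "inverted" dropout: during training, surviving elements are scaled by 1/(1-p) so the expected value of each activation is preserved, and no rescaling is needed at inference time. A short sketch illustrating this on a tensor of ones:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(10)

drop.train()   # dropout active
y = drop(x)

# Each element is either zeroed or scaled by 1/(1-p) = 2.0,
# so every output value is 0.0 or 2.0.
print(y)
```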