In PyTorch, the learning rate can be adjusted in several ways:
Method 1: use a built-in learning-rate scheduler such as StepLR, which multiplies the learning rate by gamma every step_size epochs:

import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)  # lr *= 0.1 every 30 epochs

for epoch in range(num_epochs):
    # Train the model
    ...
    # Update the learning rate once per epoch
    scheduler.step()
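The schedule StepLR produces has a simple closed form: after a given number of epochs, the learning rate is the initial rate times gamma raised to the number of completed step_size periods. A minimal torch-free sketch of that formula (the function name step_lr is made up for illustration):

```python
def step_lr(initial_lr, step_size, gamma, epoch):
    """Closed-form learning rate produced by a StepLR-style schedule at `epoch`."""
    return initial_lr * gamma ** (epoch // step_size)

# With lr=0.1, step_size=30, gamma=0.1 (the values used above),
# the rate stays at 0.1 for epochs 0-29, drops to 0.01 at epoch 30, and so on.
for epoch in (0, 29, 30, 59, 60):
    print(epoch, step_lr(0.1, 30, 0.1, epoch))
```

This is useful for sanity-checking what learning rate a given epoch will see before launching a long training run.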
Method 2: manually set the learning rate at a fixed epoch by writing to the optimizer's param_groups:

optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    if epoch == 30:
        # Drop the learning rate from 0.1 to 0.01 at epoch 30
        for param_group in optimizer.param_groups:
            param_group['lr'] = 0.01
Method 3: decay the learning rate periodically, here multiplying it by 0.1 every 10 epochs:

optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    if epoch > 0 and epoch % 10 == 0:  # guard epoch 0 so the initial lr is used first
        for param_group in optimizer.param_groups:
            param_group['lr'] *= 0.1
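Because the multiplication is applied in place, the effective rate after each decay is the initial rate times 0.1 raised to the number of decays so far. Note that an unguarded `epoch % 10 == 0` check also fires at epoch 0, decaying the rate immediately; the torch-free sketch below (the helper name decayed_lrs is made up for illustration) skips epoch 0 so the initial rate is used for the first full period:

```python
def decayed_lrs(initial_lr, num_epochs, period=10, factor=0.1):
    """Simulate the in-place decay loop above and record the lr seen at each epoch."""
    lr, history = initial_lr, []
    for epoch in range(num_epochs):
        if epoch > 0 and epoch % period == 0:  # skip epoch 0 so the initial lr is used first
            lr *= factor
        history.append(lr)
    return history

lrs = decayed_lrs(0.1, 25)
print(lrs[0], lrs[10], lrs[20])  # one decade lower after each 10-epoch period
```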
The above are several common ways to adjust the learning rate; when training a neural network, choose whichever best fits your situation.