
About the author: Tutou Xiaosu (秃头小苏), committed to describing problems in the most accessible language possible


Using TensorBoard in PyTorch

Preface

  I've recently been planning to post PyTorch tutorials from time to time, writing down approaches I run into often so I don't waste time searching for them again. I've already written two PyTorch tutorials; if you're interested, take a look:

  • Understanding transforms.ToTensor and transforms.Normalize in PyTorch
  • Saving, loading, and resuming training of PyTorch models

  This post covers how to use TensorBoard. As before, it follows the official PyTorch tutorial with my own understanding mixed in; I hope you get something out of it.

  Ready? Then let's get started!!!

Importing the required packages

  First we need to import the relevant packages; for this post the most important one is SummaryWriter. If you haven't installed TensorBoard yet, do that first — a quick search will tell you it's a single command (pip install tensorboard).

# PyTorch model and training necessities
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
# Image datasets and image manipulation
import torchvision
import torchvision.transforms as transforms
# Image display
import matplotlib.pyplot as plt
import numpy as np
# PyTorch TensorBoard support
from torch.utils.tensorboard import SummaryWriter

Loading the dataset

  The dataset we use this time is FashionMNIST. It is very similar to MNIST: single-channel 28×28 images of clothing items such as shirts and shoes. Loading it is straightforward, so I won't walk through it here; see my earlier posts if anything is unclear. One thing to note: the official tutorial sets num_workers=2 in the DataLoader. If you are training on CPU or debugging, be sure to set num_workers to 0.

# Gather datasets and prepare them for consumption
transform = transforms.Compose(
    [transforms.ToTensor(),
     transforms.Normalize((0.5,), (0.5,))])
# Store separate training and validations splits in ./data
training_set = torchvision.datasets.FashionMNIST('./data',
    download=True,
    train=True,
    transform=transform)
validation_set = torchvision.datasets.FashionMNIST('./data',
    download=True,
    train=False,
    transform=transform)
training_loader = torch.utils.data.DataLoader(training_set,
                                              batch_size=4,
                                              shuffle=True,
                                              num_workers=0)
validation_loader = torch.utils.data.DataLoader(validation_set,
                                                batch_size=4,
                                                shuffle=False,
                                                num_workers=0)

Visualizing images with matplotlib

  Let's first take a look at the images. We build a grid from one batch and save it to the result folder so we can inspect the output.

# Extract a batch of 4 images
dataiter = iter(training_loader)
images, labels = next(dataiter)
# Create a grid from the images and save it
import os
os.makedirs("./result", exist_ok=True)  # save_image fails if the folder is missing
img_grid = torchvision.utils.make_grid(images)
torchvision.utils.save_image(img_grid, "./result/img_grid.bmp")
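If you'd rather view the grid with matplotlib directly instead of only saving it, a minimal sketch looks like this (it assumes the Normalize((0.5,), (0.5,)) transform above, so pixel values lie in [-1, 1] and must be mapped back before display):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line to use plt.show()
import matplotlib.pyplot as plt

def unnormalize(img):
    # Undo Normalize((0.5,), (0.5,)): maps [-1, 1] back to [0, 1]
    return img * 0.5 + 0.5

def show_grid(img_grid, out_path="./result/img_grid.png"):
    npimg = unnormalize(np.asarray(img_grid))
    plt.imshow(np.transpose(npimg, (1, 2, 0)))  # CHW -> HWC for imshow
    plt.axis("off")
    plt.savefig(out_path)
```

Calling show_grid(img_grid) writes the same grid as a PNG; when working interactively, call plt.show() instead of plt.savefig().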

Visualizing images with TensorBoard

  First we create a SummaryWriter; 'runs/fashion_mnist_experiment_1' is the directory the event files are written to. Then we log the image grid with add_image. The flush() call makes sure everything is actually written to disk.

# Default log_dir argument is "runs" - but it's good to be specific
# torch.utils.tensorboard.SummaryWriter is imported above
writer = SummaryWriter('runs/fashion_mnist_experiment_1')
# Write image data to TensorBoard log dir
writer.add_image('Four Fashion-MNIST Images', img_grid)
writer.flush()
# To view, start TensorBoard on the command line with:
#   tensorboard --logdir=runs
# ...and open a browser tab to http://localhost:6006/

  Once this has run, type tensorboard --logdir=runs in a terminal, where runs is the log directory, as shown below:

(Figure: launching TensorBoard from the terminal)

  Now open http://localhost:6006/ to view the images, as shown below:

(Figure: the image grid displayed in TensorBoard)

Visualizing the model with TensorBoard

  First, let's create a simple model, as follows:

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.fc1 = nn.Linear(16 * 4 * 4, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)
    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 16 * 4 * 4)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x
net = Net()
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
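Where does the 16 * 4 * 4 in fc1 come from? You can check the feature-map sizes with a little arithmetic (a sketch: each 5×5 convolution above has stride 1 and no padding, and each 2×2 max-pool halves the size):

```python
def conv_out(size, kernel, stride=1):
    # Output size of a valid (no-padding) convolution
    return (size - kernel) // stride + 1

s = 28                     # FashionMNIST images are 28x28
s = conv_out(s, 5) // 2    # conv1 (5x5) -> 24, then 2x2 max-pool -> 12
s = conv_out(s, 5) // 2    # conv2 (5x5) -> 8,  then 2x2 max-pool -> 4
print(16 * s * s)          # 16 channels * 4 * 4 = 256, fc1's in_features
```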

  The dataset here is still FashionMNIST. The function used to visualize the model is add_graph(); everything else is basically the same as for visualizing images.

# Again, grab a single mini-batch of images
dataiter = iter(training_loader)
images, labels = next(dataiter)
writer = SummaryWriter('runs/fashion_mnist_experiment_1')
# add_graph() will trace the sample input through your model,
# and render it as a graph.
writer.add_graph(net, images)
writer.flush()

  Once this code has run, refresh http://localhost:6006/ and a GRAPHS tab appears containing the network we just defined. Let's take a quick look, as shown below:

(Figure: the model graph in TensorBoard's GRAPHS tab)

Visualizing loss with TensorBoard

  The model here is the same as in the previous section. Straight to the training code:

print(len(validation_loader))
for epoch in range(1):  # loop over the dataset multiple times
    running_loss = 0.0
    for i, data in enumerate(training_loader, 0):
        # basic training loop
        inputs, labels = data
        optimizer.zero_grad()
        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
        if i % 1000 == 999:    # Every 1000 mini-batches...
            print('Batch {}'.format(i + 1))
            # Check against the validation set
            running_vloss = 0.0
            net.train(False)  # switch to evaluation mode for validation
            with torch.no_grad():  # no gradients needed for validation
                for j, vdata in enumerate(validation_loader, 0):
                    vinputs, vlabels = vdata
                    voutputs = net(vinputs)
                    vloss = criterion(voutputs, vlabels)
                    running_vloss += vloss.item()
            net.train(True)  # back to training mode
            avg_loss = running_loss / 1000
            avg_vloss = running_vloss / len(validation_loader)
            # Log the running loss averaged per batch
            writer.add_scalars('Training vs. Validation Loss',
                            { 'Training' : avg_loss, 'Validation' : avg_vloss },
                            epoch * len(training_loader) + i)
            running_loss = 0.0
print('Finished Training')
writer.flush()

  The function used to log the training loss is add_scalars; the rest is much the same as before. Refresh http://localhost:6006/ and you'll see the training and validation losses, as shown below:

(Figure: training vs. validation loss curves in TensorBoard)
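One detail worth noting in the training loop: the x-axis value passed to add_scalars is a global step, epoch * len(training_loader) + i, so points from later epochs continue along the same axis instead of restarting at 0. A quick sketch of the arithmetic (with batch_size=4, len(training_loader) is 15000 for FashionMNIST's 60000 training images):

```python
batches_per_epoch = 60000 // 4  # len(training_loader) with batch_size=4
for epoch, i in [(0, 999), (0, 1999), (1, 999)]:
    print(epoch * batches_per_epoch + i)
# prints 999, 1999, 15999 -- steps keep increasing across epochs
```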

Summary

  Wasn't that fun? Go give it a try. Finally, a note on how to launch TensorBoard inside Jupyter Notebook or Google Colab; it's also simple, just two steps:

  1. %load_ext tensorboard
  2. %tensorboard --logdir runs

  Run those two lines and you can use TensorBoard directly in Jupyter Notebook or Google Colab. Go try it!!!
