How to Train a Network Using Trainers in PyBrain


In this article, we will discuss how to train a network using trainers in PyBrain.

Network: A network consists of several modules, which are generally linked by connections. PyBrain provides programmers with support for neural networks. A network can be viewed as a directed acyclic graph, where each module serves as a vertex/node and the connections act as edges.
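
To make this concrete, here is a minimal sketch (assuming PyBrain is installed) that builds a small feed-forward network with the buildNetwork shortcut and prints its structure; the printed representation lists the modules (nodes) and connections (edges):

Python3

# A minimal sketch: build a network with 2 input, 3 hidden
# and 1 output units, then print its modules and connections
from pybrain.tools.shortcuts import buildNetwork

net = buildNetwork(2, 3, 1)
print(net)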

Dataset: A dataset is the collection of data passed to a network for testing, validation, and training. Compared with plain arrays, datasets are more flexible and easier to use; they closely resemble a collection of named two-dimensional arrays. Every machine learning task requires its own specific dataset.
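
As a small illustrative sketch, a SupervisedDataSet with 2-dimensional inputs and 1-dimensional targets can be filled with addSample() and then read back through its named fields:

Python3

# Sketch: a tiny SupervisedDataSet with one sample
from pybrain.datasets import SupervisedDataSet

ds = SupervisedDataSet(2, 1)       # 2 input values, 1 target value per sample
ds.addSample((0, 1), (1,))         # add one input/target pair
print(len(ds))                     # number of samples stored
print(ds['input'], ds['target'])   # the named 2D arrays behind the dataset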

Training a network with trainers in PyBrain:

PyBrain provides two trainers for training a network. They are discussed below.

1. BackpropTrainer:

This trainer is used to train the parameters of a module by backpropagating the errors (through time) according to a classified or supervised dataset.
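
A minimal, self-contained sketch of how this trainer is typically used (the tiny dataset below is only for illustration): train() runs one epoch of backpropagation and returns the average error on the training data.

Python3

# Sketch: one epoch of backpropagation on a toy dataset
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(2, 3, 1)
ds = SupervisedDataSet(2, 1)
ds.addSample((0, 0), (1,))
ds.addSample((1, 1), (0,))

trainer = BackpropTrainer(net, ds)
error = trainer.train()   # one epoch; returns the average error
print(error)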

2. TrainUntilConvergence:

TrainUntilConvergence is used to train the module on the dataset until it converges. When a neural network is developed, it is trained on the data passed to it, and whether the network has been trained well is judged by the predictions it makes on the test data used to evaluate it.
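
In PyBrain this behaviour is exposed as the trainUntilConvergence() method of a trainer such as BackpropTrainer: it holds back part of the dataset for validation and stops once the validation error stops improving. A minimal sketch (the toy dataset and the maxEpochs value are illustrative):

Python3

# Sketch: train until the validation error stops improving
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(2, 3, 1)
ds = SupervisedDataSet(2, 1)
ds.addSample((0, 0), (1,))
ds.addSample((0, 1), (1,))
ds.addSample((1, 0), (1,))
ds.addSample((1, 1), (0,))

trainer = BackpropTrainer(net, ds)

# Splits ds into training/validation parts internally and
# returns the two error curves
train_errors, val_errors = trainer.trainUntilConvergence(maxEpochs=100)
print(train_errors[-1], val_errors[-1])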

Example:

In this example, we create two datasets of type SupervisedDataSet. We use the NAND data model given below:

A   B   A NAND B
0   0   1
0   1   1
1   0   1
1   1   0
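
For reference, the target column follows directly from the NAND definition (the output is 0 only when both inputs are 1); the one-line check below is purely illustrative:

Python3

# Illustrative check of the NAND truth table
for a in (0, 1):
    for b in (0, 1):
        print(a, b, int(not (a and b)))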

The dataset used for testing is as follows:

Python3
# Creating a dataset for testing
nand_train = SupervisedDataSet(2, 1)
  
# Fit input and target values to dataset
# Parameters for nand_train truth table
nand_train.addSample((0, 0), (1,))
nand_train.addSample((0, 1), (1,))
nand_train.addSample((1, 0), (1,))
nand_train.addSample((1, 1), (0,))


The trainer used is as follows:

Python3

# Training the network with dataset nand_gate
trainer = BackpropTrainer(network, nand_gate)
  
# Iterate 100 times to train the network
for epoch in range(100):
    trainer.train()
    trainer.testOnData(dataset=nand_train, verbose=True)

Example:

Python3

# Python program to demonstrate how to train
# a network
  
# Importing libraries and packages
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# Establishing a network having two inputs,
# four hidden, and one output channels
network = buildNetwork(2, 4, 1, bias=True, hiddenclass=TanhLayer)
  
# Creating a dataset that matches the
# network input and output sizes
nand_gate = SupervisedDataSet(2, 1)
  
# Creating a dataset for testing
nand_train = SupervisedDataSet(2, 1)
  
# Fit input and target values to dataset
# Parameters for nand_gate truth table
nand_gate.addSample((0, 0), (1,))
nand_gate.addSample((0, 1), (1,))
nand_gate.addSample((1, 0), (1,))
nand_gate.addSample((1, 1), (0,))
  
# Fit input and target values to dataset
# Parameters for nand_train truth table
nand_train.addSample((0, 0), (1,))
nand_train.addSample((0, 1), (1,))
nand_train.addSample((1, 0), (1,))
nand_train.addSample((1, 1), (0,))
  
# Training the network with dataset nand_gate
trainer = BackpropTrainer(network, nand_gate)
  
# Iterate 100 times to train the network
for epoch in range(100):
    trainer.train()
    trainer.testOnData(dataset=nand_train, verbose=True)

Output:

Explanation: As you can see in the output, the test data matches the dataset that was used for training, so the error is only 0.021.
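
The error figure itself is simply the value returned by testOnData(), so it can also be captured programmatically instead of being read from the verbose log. A small sketch continuing from the program above (reusing its trainer and nand_train):

Python3

# Continuing from the program above: capture the averaged
# test error returned by testOnData()
test_error = trainer.testOnData(dataset=nand_train)
print("average test error:", test_error)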

Now, let's change the test data and run the program again.

Python3

# Creating a dataset for testing
nand_train = SupervisedDataSet(2, 1)

# Fit input and target values to dataset
# Parameters for nand_train truth table
nand_train.addSample((0, 0), (1,))
nand_train.addSample((0, 1), (0,))
nand_train.addSample((1, 0), (1,))
nand_train.addSample((1, 1), (0,))

Example:

Python3

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
  
# Establishing a network having two inputs,
# four hidden, and one output channels
network = buildNetwork(2, 4, 1, bias=True, hiddenclass=TanhLayer)
  
# Creating a dataset that matches the
# network input and output sizes
nand_gate = SupervisedDataSet(2, 1)
  
# Creating a dataset for testing
nand_train = SupervisedDataSet(2, 1)
  
# Fit input and target values to dataset
# Parameters for nand_gate truth table
nand_gate.addSample((0, 0), (1,))
nand_gate.addSample((0, 1), (1,))
nand_gate.addSample((1, 0), (1,))
nand_gate.addSample((1, 1), (0,))
  
# Fit input and target values to dataset
# Parameters for nand_train truth table
nand_train.addSample((0, 0), (1,))
nand_train.addSample((0, 1), (0,))
nand_train.addSample((1, 0), (1,))
nand_train.addSample((1, 1), (0,))
  
# Training the network with dataset nand_gate
trainer = BackpropTrainer(network, nand_gate)
  
# Iterate 100 times to train the network
for epoch in range(100):
    trainer.train()
    trainer.testOnData(dataset=nand_train, verbose=True)

Output:

Explanation: As you can see in the output, the test data does not exactly match the data the network was trained on, so the average error is 0.129, which is larger than in the previous example.
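
To see where the remaining error comes from, the trained network's prediction for each test input can be compared with the expected target. The sketch below continues from the program above (reusing its network and nand_train); activate() returns the network's output for a single input:

Python3

# Continuing from the program above: compare the network's
# output with each test target
for inp, target in nand_train:
    print(inp, network.activate(inp), target)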