What is TanhLayer in PyBrain
In this article, we will look at various features through examples of the TanhLayer defined in PyBrain. A layer in PyBrain is a function that is used for a network's hidden layers. TanhLayer applies the tanh squashing function, which maps every input value into the range (-1, 1).
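As a quick illustration of the squashing behaviour (not part of the PyBrain example itself), the standard library's math.tanh shows how arbitrary real inputs are compressed into (-1, 1):
Python
import math

# tanh maps any real number into the open interval (-1, 1)
for x in (-5, -1, 0, 1, 5):
    print(x, math.tanh(x))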
Syntax:
Import TanhLayer: from pybrain.structure import TanhLayer
Use in Python code: net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)
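A minimal standalone sketch (assuming PyBrain is installed) that builds such a network, checks that the hidden layer really is a TanhLayer, and runs a single forward pass through the untrained network:
Python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

# 2 input units, 3 tanh hidden units, 1 output unit, plus a bias unit
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)

# buildNetwork names its layers 'in', 'hidden0' and 'out'
print(net['hidden0'])        # should show the TanhLayer module
print(net.activate((1, 0)))  # one forward pass; the value is random before training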
Example 1:
In this example, we import TanhLayer with the import command and use buildNetwork() to create a network with an input layer, a hidden layer and an output layer, passing TanhLayer as the hidden class. We then declare the sizes of the input and target with SupervisedDataSet(), add sample data for the AND table and the NOR table, and train the network with BackpropTrainer() for 2500 iterations. Finally, testing begins and we can see each sample's output, the correct target and the error, along with the error statistics. The sample data taken here for the AND table is ((0, 0), (0,)) and ((0, 1), (1,)), and for the NOR table ((0, 0), (0,)) and ((0, 1), (1,)).
Python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
# two inputs, three hidden units, and one output
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
# AND truth table
gate_set.addSample((0, 0), (0,))
gate_set.addSample((0, 1), (1,))
# NOR truth table
test_dataset.addSample((0, 0), (0,))
test_dataset.addSample((0, 1), (1,))
# Train the network with net and gate_set
backpr_tr = BackpropTrainer(net, gate_set)
# 2500 training iterations
for i in range(2500):
    backpr_tr.train()
# Testing....
backpr_tr.testOnData(dataset=test_dataset, verbose=True)
Output:
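With verbose=True, testOnData() lists each test sample's network output, the correct target and the error, followed by the aggregate statistics (average, maximum and median error). You can also query the trained network directly by appending a couple of lines to the script above; the exact numbers depend on the random weight initialisation and vary between runs:
Python
# appended after testOnData(): ask the trained network for its own predictions
print(net.activate((0, 0)))  # trained towards the target 0
print(net.activate((0, 1)))  # trained towards the target 1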
Example 2:
In this example, the sample data taken in the AND table is ((0, 0), (1,)) and ((0, 1), (1,)), and in the NOR table ((0, 0), (0,)) and ((0, 1), (0,)); in the test output we can again see the average error, maximum error, median error, and so on.
Python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer
# two inputs, three hidden units, and one output
net = buildNetwork(2, 3, 1, bias=True, hiddenclass=TanhLayer)
gate_set = SupervisedDataSet(2, 1)
test_dataset = SupervisedDataSet(2, 1)
# AND truth table
gate_set.addSample((0, 0), (1,))
gate_set.addSample((0, 1), (1,))
# NOR truth table
test_dataset.addSample((0, 0), (0,))
test_dataset.addSample((0, 1), (0,))
# Train the network with net and gate_set
backpr_tr = BackpropTrainer(net, gate_set)
# 2500 training iterations
for i in range(2500):
    backpr_tr.train()
# Testing....
backpr_tr.testOnData(dataset=test_dataset, verbose=True)
Output:
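As with the first example, you can check the trained network's predictions yourself by iterating over the test samples; a short sketch that could be appended to the script above (outputs vary from run to run):
Python
# appended after testOnData(): compare each prediction with the stored target
for inp, target in zip(test_dataset['input'], test_dataset['target']):
    print(inp, '->', net.activate(inp), 'expected', target)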