Pybrain claims to be the best neural network library for Python. Scikit-Learn is generally regarded as the best machine learning library for Python, but it does not include neural networks, so it did not suit my needs.
I have also seen some references to the Neurolab library, and I plan to try it later (it seems to support a different set of neural network types).
Pybrain's documentation is well written, but the examples are not what I need: the ones in the official documentation are for classification, not for data fitting (prediction, or regression).
In addition, the official documentation of some functions (methods) is incomplete; for those you have to call help() in a python shell, or read the source code directly.
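For example, in a python shell you can inspect a class or a single method like this (this is just standard Python help(), applied to pybrain):

from pybrain.structure import FeedForwardNetwork
help(FeedForwardNetwork)              # docstring of the whole class
help(FeedForwardNetwork.sortModules)  # docstring of a single method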
Well, let's get back to business. The work roughly divides into the following steps:
1. Constructing the neural network
2. Constructing the dataset
3. Training the neural network
4. Visualizing the results
5. Verification and analysis
You can use pybrain's shortcut for quickly building a network, or assemble the network yourself. Here the second approach is taken, building a feedforward neural network (a sketch of the quick approach appears after the code below).
from pybrain.structure import *

# build the neural network fnn
fnn = FeedForwardNetwork()

# set up three layers: an input layer (3 neurons, called inLayer here),
# a hidden layer and an output layer
inLayer = LinearLayer(3)
hiddenLayer = SigmoidLayer(7)
outLayer = LinearLayer(1)

# add all three layers to the neural network (i.e. register the neurons)
fnn.addInputModule(inLayer)
fnn.addModule(hiddenLayer)
fnn.addOutputModule(outLayer)

# establish the connections between the three layers
in_to_hidden = FullConnection(inLayer, hiddenLayer)
hidden_to_out = FullConnection(hiddenLayer, outLayer)

# add the connections to the neural network
fnn.addConnection(in_to_hidden)
fnn.addConnection(hidden_to_out)

# make the neural network usable
fnn.sortModules()
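For reference, the first (quick) approach mentioned above is pybrain's buildNetwork shortcut. A minimal sketch that builds a similar 3-7-1 network (note that buildNetwork adds bias units by default, which the manual construction above does not):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import SigmoidLayer, LinearLayer

# 3 linear inputs, 7 sigmoid hidden units, 1 linear output
net = buildNetwork(3, 7, 1, hiddenclass=SigmoidLayer, outclass=LinearLayer)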
When constructing the dataset I used SupervisedDataSet, the supervised dataset type; you can also try the other dataset types.
from pybrain.datasets import SupervisedDataSet

# define the dataset format: three-dimensional input, one-dimensional output
DS = SupervisedDataSet(3, 1)

# add sample points to the dataset
# assuming x1, x2, x3 are the three input vectors, y is the output vector,
# and they all have the same length
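# (illustrative assumption, not from the original post: generate some
#  synthetic sample data with numpy so the snippet runs standalone)
import numpy as np
x1 = np.random.rand(200)
x2 = np.random.rand(200)
x3 = np.random.rand(200)
y = x1 ** 2 + 2 * x2 - x3  # an arbitrary target function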
for i in range(len(y)):
    DS.addSample([x1[i], x2[i], x3[i]], [y[i]])

# if you want to get at the inputs / outputs inside, you can use
X = DS['input']
Y = DS['target']

# to split the dataset into a training set and a test set
# (training set : test set = 8 : 2), use the following statement;
# to make later calls easier, the inputs and outputs are extracted here too
dataTrain, dataTest = DS.splitWithProportion(0.8)
xTrain, yTrain = dataTrain['input'], dataTrain['target']
xTest, yTest = dataTest['input'], dataTest['target']
With that, the dataset construction is done.
On to training the neural network. As the saying goes, 20% of the code does 80% of the work; indeed, the most important code here is just the following few lines.
Of course, calling other people's code without knowing anything about the internal implementation always feels a bit like a joke.
from pybrain.supervised.trainers import BackpropTrainer

# the trainer uses the BP (backpropagation) algorithm
# verbose=True prints the total error during training; by default the library
# splits the data into training and validation sets at 4:1, which can be
# changed via the parameters in the parentheses
trainer = BackpropTrainer(fnn, dataTrain, verbose=True, learningrate=0.01)

# maxEpochs is the maximum number of iterations allowed for convergence;
# the method used here trains until convergence, and I usually set it to 1000
trainer.trainUntilConvergence(maxEpochs=1000)
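Incidentally, trainUntilConvergence returns the per-epoch training and validation errors, which is handy if you want to inspect the learning curves afterwards. A minimal sketch (the variable names are my own):

# the two returned lists hold the training / validation error for each epoch
trainErrors, validationErrors = trainer.trainUntilConvergence(maxEpochs=1000)
print('final training error: ' + str(trainErrors[-1]))
print('final validation error: ' + str(validationErrors[-1]))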
I won't say much about data visualization; I basically use Pylab for it, see this blog post:
Some drawing functions of Python.
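As a minimal sketch (my own, assuming the fnn, xTest and yTest defined above), you can compare the network's predictions against the true values over the whole test set:

import pylab

# run every test sample through the trained network
predictions = [fnn.activate(x) for x in xTest]
pylab.plot(yTest, label='true value')
pylab.plot(predictions, label='prediction')
pylab.legend()
pylab.show()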
First, we can pick a random sample and look at the result.
import random

# c is a random index from 0 up to the length of xTest (including 0, excluding
# the length; random.randint is inclusive on both ends, hence the -1)
c = random.randint(0, xTest.shape[0] - 1)

# X2 is the randomly chosen sample point from xTest
X2 = xTest[c, :]

# the activate function gives the trained network's predicted output for X2
prediction = fnn.activate(X2)

# print it out
print('true number is: ' + str(yTest[c]),
      'prediction number is: ' + str(prediction),
      'error: ' + str((prediction - yTest[c]) / yTest[c]))
We can also print out the neural network itself. The code below was found on Stack Overflow (I forget the exact source; thanks to whoever built this wheel).
It lets you see the weights of every connection.
for mod in fnn.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--parameters:", mod.params)
    for conn in fnn.connections[mod]:
        print("-connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)

if hasattr(fnn, "recurrentConns"):
    print("Recurrent connections")
    for conn in fnn.recurrentConns:
        print("-", conn.inmod.name, "to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)
We can use a timer to measure the running time of the program and judge its performance.
import time

# call this before the code that needs timing
start = time.clock()
# ... the code to be timed goes here ...
# read the clock again after the code that needs timing
# (note: time.clock() was removed in Python 3.8; time.perf_counter() is the
#  modern equivalent)
elapsed = time.clock() - start
print("Time used: " + str(elapsed))
If you need some statistics, you can write your own statistics functions, or look in the package's tools module, which has some statistical functions such as mean squared error (MSE).
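For example, pybrain.tools.validation provides a Validator class with an MSE function; a minimal sketch of computing the mean squared error on the test set (assuming the variables defined above):

from pybrain.tools.validation import Validator
import numpy as np

# stack the per-sample predictions into one array and compare with the targets
output = np.array([fnn.activate(x) for x in xTest])
print('MSE on the test set: ' + str(Validator.MSE(output, yTest)))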