PyGAD 2.7.0
Changes in PyGAD 2.7.0 (11 September 2020):
- The learning_rate parameter in the pygad.nn.train() function defaults to 0.01.
- Added support for building neural networks for regression using the new parameter named problem_type. It is added as a parameter to both the pygad.nn.train() and pygad.nn.predict() functions. The value of this parameter can be either "classification" or "regression" to define the problem type. It defaults to "classification".
- The activation function for a layer can be set to the string "None" to indicate that there is no activation function at that layer. As a result, the supported values for the activation function are "sigmoid", "relu", "softmax", and "None".
To build a regression network using the pygad.nn module, just do the following:
- Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
- Set the activation function for the output layer to the string "None". This places no limit on the range of the outputs, which can be anywhere from -infinity to +infinity. If you are sure that all outputs will be nonnegative values, then use the ReLU function instead.
Check the documentation of the pygad.nn module for an example that builds a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NumPyANN
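As a minimal sketch, a regression network with the pygad.nn module could look like the code below. It assumes the pygad.nn.InputLayer and pygad.nn.DenseLayer classes together with the train()/predict() signatures described in the module documentation, and the tiny data arrays are made up purely for illustration; refer to the linked example for a complete, verified script.

```python
import numpy
import pygad.nn

# Toy regression data (made up for illustration): 2 samples, 4 features, 2 outputs.
data_inputs = numpy.array([[2.0, 5.0, -3.0, 0.1],
                           [8.0, 15.0, 20.0, 13.0]])
data_outputs = numpy.array([[0.1, 0.2],
                            [1.8, 1.5]])

# Network: input layer -> ReLU hidden layer -> linear output layer.
# The "None" activation on the output layer leaves the outputs unbounded.
input_layer = pygad.nn.InputLayer(4)
hidden_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=input_layer,
                                   activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=hidden_layer,
                                   activation_function="None")

# problem_type="regression" switches both training and prediction to regression mode.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,  # the new default value
               problem_type="regression")

predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error:", abs_error)
```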
To build and train a regression network using the pygad.gann module, do the following:
- Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
- Set the output_activation parameter in the constructor of the pygad.gann.GANN class to "None".
Check the documentation of the pygad.gann module for an example that builds and trains a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NeuralGenetic
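As a rough sketch of how the pieces fit together, the code below assumes the pygad.gann.GANN constructor, the pygad.gann.population_as_vectors()/population_as_matrices() helpers, and the 2.7.0-era pygad.GA parameters (fitness_func taking solution and sol_idx, and the callback_generation callback) behave as described in the documentation. The data, network sizes, and GA settings are placeholders; see the linked example for a complete, verified script.

```python
import numpy
import pygad
import pygad.nn
import pygad.gann

# Toy regression data (made up for illustration).
data_inputs = numpy.array([[2.0, 5.0, -3.0, 0.1],
                           [8.0, 15.0, 20.0, 13.0]])
data_outputs = numpy.array([[0.1, 0.2],
                            [1.8, 1.5]])

# output_activation="None" makes every network in the population end with an
# unbounded (linear) output layer, which is what regression needs.
GANN_instance = pygad.gann.GANN(num_solutions=6,
                                num_neurons_input=4,
                                num_neurons_hidden_layers=[2],
                                num_neurons_output=2,
                                hidden_activations=["relu"],
                                output_activation="None")

def fitness_func(solution, sol_idx):
    # Predict with the network of the current solution in regression mode.
    predictions = pygad.nn.predict(last_layer=GANN_instance.population_networks[sol_idx],
                                   data_inputs=data_inputs,
                                   problem_type="regression")
    abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
    return 1.0 / (abs_error + 0.00000001)

def callback_generation(ga_instance):
    # Copy the evolved weight vectors back into the networks after every generation.
    population_matrices = pygad.gann.population_as_matrices(population_networks=GANN_instance.population_networks,
                                                            population_vectors=ga_instance.population)
    GANN_instance.update_population_trained_weights(population_trained_weights=population_matrices)

# The networks' weights, flattened into vectors, serve as the GA's initial population.
initial_population = pygad.gann.population_as_vectors(population_networks=GANN_instance.population_networks)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=4,
                       initial_population=initial_population,
                       fitness_func=fitness_func,
                       mutation_percent_genes=5,
                       callback_generation=callback_generation)
ga_instance.run()
```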
To build a classification network, either ignore the problem_type parameter or set it to "classification" (default value). In this case, the activation function of the last layer can be set to any type (e.g. softmax).
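For comparison, a classification network differs only in the output layer's activation and the problem_type value. The short sketch below relies on the same assumed pygad.nn API as the regression sketch above, with made-up data where outputs are integer class labels:

```python
import numpy
import pygad.nn

# Toy classification data (made up for illustration): 4 samples, 2 features, 2 classes.
data_inputs = numpy.array([[1.0, 0.0],
                           [0.0, 1.0],
                           [1.0, 1.0],
                           [0.0, 0.0]])
data_outputs = numpy.array([1, 1, 0, 0])  # class labels

input_layer = pygad.nn.InputLayer(2)
hidden_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=input_layer,
                                   activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=hidden_layer,
                                   activation_function="softmax")

# problem_type is omitted, so it defaults to "classification".
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs)

predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs)
print(predictions)
```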