
PyGAD-2.7.0


Changes in PyGAD 2.7.0 (11 September 2020):

  1. The learning_rate parameter in the pygad.nn.train() function defaults to 0.01.
  2. Added support for building neural networks for regression using the new parameter named problem_type. It is added as a parameter to both the pygad.nn.train() and pygad.nn.predict() functions. Its value can be either "classification" or "regression" to define the problem type, and it defaults to "classification".
  3. The activation function for a layer can be set to the string "None" to indicate that the layer has no activation function. As a result, the supported values for the activation function are "sigmoid", "relu", "softmax", and "None".

To build a regression network using the pygad.nn module, just do the following:

  1. Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
  2. Set the activation function for the output layer to the string "None". This places no limit on the range of the outputs, which can be anywhere from -infinity to +infinity. If you are sure that all outputs will be nonnegative, then use the ReLU function.

Check the documentation of the pygad.nn module for an example that builds a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NumPyANN
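Here is a minimal sketch of these two steps using the pygad.nn API (InputLayer, DenseLayer, train(), and predict()). The data arrays and layer sizes are placeholders, so adapt them to your own dataset:

```python
import numpy
import pygad.nn

# Placeholder regression data: 2 samples, 4 features, 2 continuous outputs per sample.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])
data_outputs = numpy.array([[0.1, 0.2],
                            [1.8, 1.5]])

# Build the network. The output layer uses the new "None" activation (step 2).
input_layer = pygad.nn.InputLayer(data_inputs.shape[1])
hidden_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=input_layer,
                                   activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=hidden_layer,
                                   activation_function="None")

# Train and predict with problem_type="regression" (step 1).
# learning_rate is optional and now defaults to 0.01.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               problem_type="regression")

predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error:", abs_error)
```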

To build and train a regression network using the pygad.gann module, do the following:

  1. Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
  2. Set the output_activation parameter in the constructor of the pygad.gann.GANN class to "None".

Check the documentation of the pygad.gann module for an example that builds and trains a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NeuralGenetic
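The sketch below shows these two steps in isolation. The data arrays and network sizes are placeholders, and the rest of the genetic-algorithm setup (creating the pygad.GA instance, the callback that copies the evolved weights back into the networks, etc.) is omitted because it is unchanged; see the documentation or the NeuralGenetic project for the complete workflow:

```python
import numpy
import pygad.nn
import pygad.gann

# Placeholder regression data: 2 samples, 4 features, 2 continuous outputs per sample.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])
data_outputs = numpy.array([[0.1, 0.2],
                            [1.8, 1.5]])

# Step 2: output_activation="None" so the network outputs are unbounded.
GANN_instance = pygad.gann.GANN(num_solutions=6,
                                num_neurons_input=data_inputs.shape[1],
                                num_neurons_hidden_layers=[2],
                                num_neurons_output=2,
                                hidden_activations=["relu"],
                                output_activation="None")

# Step 1: pass problem_type="regression" when predicting inside the fitness function.
def fitness_func(solution, sol_idx):
    predictions = pygad.nn.predict(last_layer=GANN_instance.population_networks[sol_idx],
                                   data_inputs=data_inputs,
                                   problem_type="regression")
    # Lower absolute error means higher fitness.
    return 1.0 / numpy.mean(numpy.abs(predictions - data_outputs))
```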

To build a classification network, either omit the problem_type parameter or set it to "classification" (its default value). In this case, the activation function of the last layer can be set to any supported type (e.g. "softmax").
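For contrast with the regression sketches above, this is what the default classification path looks like (placeholder binary-classification data, arbitrary layer sizes):

```python
import numpy
import pygad.nn

# Placeholder classification data: 4 samples, 2 features, binary class labels.
data_inputs = numpy.array([[0.1, 0.4],
                           [0.7, 0.2],
                           [0.3, 0.9],
                           [0.8, 0.5]])
data_outputs = numpy.array([0, 1, 0, 1])

input_layer = pygad.nn.InputLayer(2)
hidden_layer = pygad.nn.DenseLayer(num_neurons=4,
                                   previous_layer=input_layer,
                                   activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=2,
                                   previous_layer=hidden_layer,
                                   activation_function="softmax")

# problem_type is omitted, so it defaults to "classification".
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs)

predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)
print(predictions)
```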