tmiethlinger/PyTorch-Multilayer_Perceptron
PyTorch: Multilayer Perceptron

In this repo we implement a multilayer perceptron using PyTorch.

Overview

Multilayer perceptrons (MLPs), also called feedforward neural networks, are basic but flexible and powerful machine learning models that can be applied to many different kinds of problems. I have used this class many times for surrogate modeling problems in laser-plasma physics.

Basically, as long as the underlying data set is not too high-dimensional, MLPs can be a good starting point. For high-dimensional data such as images, however, MLPs tend to overfit.

Furthermore, if we can expect a smooth dependent variable (also called a response surface or target function), e.g. due to physical considerations, we can use a sigmoid-shaped activation function such as tanh, which on my data sets has shown better performance than the more common ReLU activation function.
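A minimal sketch of such an MLP in PyTorch with tanh activations (the layer sizes here are illustrative assumptions, not the repository's actual architecture):

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Simple feedforward network with tanh activations."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.Tanh(),  # smooth activation, suited to smooth response surfaces
            nn.Linear(hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = MLP(in_dim=3, hidden_dim=32, out_dim=1)
y = model(torch.randn(8, 3))  # batch of 8 samples -> output shape (8, 1)
```

Because tanh is smooth everywhere, the resulting network is infinitely differentiable in its inputs, which matches the smoothness assumption on the target function.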

If you are looking for a different MLP implementation, check out scikit-learn.org.

Dependencies

  1. numpy
  2. torch
    • torch.optim
    • torch.nn
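Putting these dependencies together, a typical regression training loop might look like the following sketch (the toy data and hyperparameters are made up for illustration):

```python
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

# Toy data: learn a smooth 1D target function from numpy arrays
x_np = np.linspace(-1.0, 1.0, 200, dtype=np.float32).reshape(-1, 1)
y_np = np.sin(np.pi * x_np)

x = torch.from_numpy(x_np)
y = torch.from_numpy(y_np)

model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

initial_loss = loss_fn(model(x), y).item()

for epoch in range(500):
    optimizer.zero_grad()          # clear accumulated gradients
    loss = loss_fn(model(x), y)    # mean squared error on the full batch
    loss.backward()                # backpropagate
    optimizer.step()               # update weights
```

Full-batch training is fine for a data set this small; for larger data sets one would iterate over mini-batches instead.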
