Regularisation-in-VAE-Model

Applied regularization techniques to improve the performance of a VAE model: L1/L2 regularization (weight decay), dropout, batch normalization, Beta-VAE (modified KL divergence term), and data augmentation. Background on variational autoencoders: https://www.geeksforgeeks.org/variational-autoencoders/
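As a sketch of how two of the listed techniques combine in the training objective, the Beta-VAE loss below weights the KL divergence term by a factor beta, while L1/L2 penalties are added separately. This is illustrative code, not taken from this repository; the beta value and penalty strengths are assumptions.

```python
import torch
import torch.nn.functional as F

def beta_vae_loss(recon_x, x, mu, logvar, beta=4.0):
    """Beta-VAE objective: reconstruction loss + beta-weighted KL term.

    beta = 1 recovers the standard VAE ELBO; beta > 1 (4.0 here is an
    illustrative choice) pressures the latent code toward disentanglement.
    """
    # Reconstruction term: summed squared error, averaged over the batch
    recon = F.mse_loss(recon_x, x, reduction="sum") / x.size(0)
    # KL divergence between q(z|x) = N(mu, sigma^2) and the prior N(0, I)
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
    return recon + beta * kl

# L2 regularization (weight decay) is usually supplied via the optimizer:
#   optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-5)
# An explicit L1 penalty can be added to the loss instead:
#   l1 = 1e-5 * sum(p.abs().sum() for p in model.parameters())
```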

Autoencoders are neural network architectures designed for the compression and reconstruction of data. An autoencoder consists of an encoder and a decoder; together these networks learn a compact representation of the input data. A reconstruction loss encourages the output to closely match the input, and this setup is the basis for understanding more advanced architectures such as VAEs.
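The encoder/decoder structure described above can be sketched as a minimal convolutional VAE for 32x32 RGB images, with batch normalization and dropout included as the regularizers mentioned earlier. The layer sizes, latent dimension, and dropout rate are illustrative assumptions, not the repository's actual architecture.

```python
import torch
import torch.nn as nn

class ConvVAE(nn.Module):
    """Minimal convolutional VAE for 32x32 RGB inputs (e.g. CIFAR-10),
    with BatchNorm and Dropout as regularizers. Illustrative only."""

    def __init__(self, latent_dim=64, dropout=0.2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 32x32 -> 16x16
            nn.BatchNorm2d(32), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 16x16 -> 8x8
            nn.BatchNorm2d(64), nn.ReLU(),
            nn.Flatten(), nn.Dropout(dropout),
        )
        # Two heads produce the mean and log-variance of q(z|x)
        self.fc_mu = nn.Linear(64 * 8 * 8, latent_dim)
        self.fc_logvar = nn.Linear(64 * 8 * 8, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 8x8 -> 16x16
            nn.BatchNorm2d(32), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 16x16 -> 32x32
            nn.Sigmoid(),
        )

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps so gradients flow through mu, logvar
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar
```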

Dataset: CIFAR-10
