jeremy-jmc/ML_from_scratch

ML_from_scratch

The goal of working in this repository over the course of this year is to implement more than 50% of the algorithms mentioned in this blog.

This repository is version 2 of this repo.

Hypothesis

A machine learning hypothesis is a candidate model that approximates the target function mapping inputs to outputs.
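As a minimal sketch (the function and parameter names are illustrative, not from this repository), a linear hypothesis is one such candidate model: different parameter values give different candidate approximations of the same target function.

```python
def hypothesis(x, w, b):
    """A candidate linear model h(x) = w*x + b approximating the target function."""
    return w * x + b

# Two different hypotheses (candidate models) evaluated on the same input
print(hypothesis(2.0, w=3.0, b=1.0))  # candidate 1: 7.0
print(hypothesis(2.0, w=0.5, b=0.0))  # candidate 2: 1.0
```

Learning then amounts to searching this space of candidates for the parameters that best fit the data.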

Cost Function vs. Loss Functions

A cost function measures the performance of an ML model on given data; it is essentially a calculation of the error between predicted values and expected values. The cost function is the average error over the n samples in the data, while the loss function is the error for an individual data point.
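The distinction can be sketched with squared error (an assumed choice of error; the function names are illustrative): the loss scores one prediction, and the cost averages the losses over all n samples.

```python
def loss(y_pred, y_true):
    # Loss: error for a single data point (squared error here)
    return (y_pred - y_true) ** 2

def cost(y_preds, y_trues):
    # Cost: average of the per-sample losses over the n samples
    n = len(y_preds)
    return sum(loss(p, t) for p, t in zip(y_preds, y_trues)) / n

preds, trues = [2.0, 3.0, 5.0], [1.0, 3.0, 4.0]
print([loss(p, t) for p, t in zip(preds, trues)])  # [1.0, 0.0, 1.0]
print(cost(preds, trues))  # 2/3, the mean of the three losses
```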

Gradient Descent

Gradient descent is an optimization algorithm used to find the values of the parameters (coefficients) of a function that minimize a cost function.

The algorithm minimizes the error of a function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient vector.
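A minimal from-scratch sketch of this idea, assuming a 1-D linear model y = w*x with a mean-squared-error cost (data and hyperparameters are illustrative): each step moves the parameter against the gradient of the cost.

```python
def gradient_descent(xs, ys, lr=0.1, steps=100):
    """Fit y = w*x by repeatedly stepping against the gradient of the MSE cost."""
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of J(w) = (1/n) * sum((w*x - y)^2) with respect to w
        grad = (2.0 / n) * sum((w * x - y) * x for x, y in zip(xs, ys))
        w -= lr * grad  # move in the direction of the negative gradient
    return w

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # true relation: y = 2x
print(gradient_descent(xs, ys))  # converges very close to 2.0
```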

Learning rate

The learning rate is a tuning parameter in an optimization algorithm that determines the step size at each iteration while moving toward a minimum of the cost function.

Bibliography

  • UvA Machine Learning
  • Machine Learning Mastery
  • Daniel Bourke
  • Andrew Ng
  • Andrej Karpathy
  • Jeremy Jordan
