This repository is a collection of introductory Machine Learning code notebooks, ranging from Simple Linear Regression to writing the Backpropagation of a Neural Network. Most of the code is written from scratch to demonstrate what happens inside the black box of the Machine Learning models offered by popular high-level libraries.
After the from-scratch implementation using basic libraries like NumPy, SciPy, and Matplotlib, each notebook demonstrates the same task with the ready-made models in high-level APIs like scikit-learn and Keras, as in the sketch below.
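To give a flavour of that pattern, here is a minimal sketch (illustrative, not the notebook code itself): simple linear regression fitted first with hand-rolled batch gradient descent in NumPy, then with scikit-learn. The synthetic data, learning rate, and epoch count are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y = 3x + 4 plus noise (illustrative values).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 4 + rng.normal(0, 1, size=100)

# --- From scratch: batch gradient descent on the MSE ---
w, b = 0.0, 0.0
lr, epochs = 0.01, 1000
n = len(y)
for _ in range(epochs):
    y_hat = w * X[:, 0] + b
    error = y_hat - y
    w -= lr * (2 / n) * np.dot(error, X[:, 0])  # dMSE/dw
    b -= lr * (2 / n) * error.sum()             # dMSE/db
print(f"scratch: w={w:.3f}, b={b:.3f}")         # w close to 3, b close to 4

# --- Same task with scikit-learn ---
model = LinearRegression().fit(X, y)
print(f"sklearn: w={model.coef_[0]:.3f}, b={model.intercept_:.3f}")
```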
Total Notebooks:
- Simple Linear Regression without Regularization (using Gradient Descent as well as the Normal Equation)
- Simple Linear Regression with Regularization (using Gradient Descent as well as the Normal Equation)
- Multiple Linear Regression without Regularization (using Gradient Descent as well as the Normal Equation)
- Multiple Linear Regression with Regularization (using Gradient Descent as well as the Normal Equation; see the normal-equation sketch after this list)
- Polynomial Regression
- K-Nearest Neighbours Algorithm (see the KNN sketch after this list)
- Naive Bayes Algorithm - Gaussian (see the from-scratch sketch after this list)
- Naive Bayes Algorithm - Multinomial
- Naive Bayes Algorithm - Bernoulli
- Decision Trees using scikit-learn
- Decision Tree from Scratch
- Support Vector Classification Algorithm
- Support Vector Regression Algorithm
- Perceptron Network (implementing AND and other logic gates; see the sketch after this list)
- Perceptron Network with Multiple Nodes
- Backpropagation in Neural Networks (see the sketch after this list)
- Using Grid Search for hyperparameter tuning (see the GridSearchCV sketch after this list)
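For the regression notebooks, a minimal sketch of the closed-form normal equation with L2 (ridge) regularization, assuming a NumPy feature matrix `X` and target `y`; the regularization strength `lam` is an illustrative parameter, and the bias column is left unpenalized.

```python
import numpy as np

def ridge_normal_equation(X, y, lam=1.0):
    """Solve (X^T X + lam * I) w = X^T y; set lam=0 for plain least squares."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column
    reg = lam * np.eye(Xb.shape[1])
    reg[0, 0] = 0.0  # do not regularize the bias term
    return np.linalg.solve(Xb.T @ Xb + reg, Xb.T @ y)

# Illustrative usage on noisy y = 2x + 1 data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.1, size=200)
print(ridge_normal_equation(X, y, lam=0.5))  # approx [1.0, 2.0]
```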
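For the KNN notebook, a minimal from-scratch sketch: classify a point by majority vote among its k nearest training points under Euclidean distance. The helper name `knn_predict` and the toy data are illustrative.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances
    nearest = y_train[np.argsort(dists)[:k]]     # labels of the k closest
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

# Illustrative usage: two 2-D clusters.
X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([4.5, 5.0])))  # -> 1
```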
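For the Naive Bayes notebooks, a minimal from-scratch sketch of the Gaussian variant: fit a per-class mean and variance for each feature and classify by the largest log posterior. The variance-smoothing constant 1e-9 is an illustrative choice.

```python
import numpy as np

class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.log_prior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # Gaussian log likelihood, summed over (assumed independent) features.
        ll = -0.5 * (np.log(2 * np.pi * self.var)[None, :, :]
                     + (X[:, None, :] - self.mu) ** 2 / self.var).sum(axis=2)
        return self.classes[np.argmax(ll + self.log_prior, axis=1)]

# Illustrative usage on two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(GaussianNB().fit(X, y).predict(X[:5]))  # -> mostly class 0
```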
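For the perceptron notebooks, a minimal sketch of a single perceptron learning the AND gate with the classic perceptron update rule; the learning rate and epoch count are illustrative.

```python
import numpy as np

# AND gate truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(20):  # a handful of epochs is enough for AND
    for xi, target in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)  # step activation
        # Perceptron rule: nudge the weights by the error.
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print([int(np.dot(w, xi) + b > 0) for xi in X])  # -> [0, 0, 0, 1]
```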
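For the backpropagation notebook, a minimal sketch of the forward and backward pass for a one-hidden-layer sigmoid network trained on XOR; the architecture, loss (squared error), and hyperparameters are illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR target

# One hidden layer with 8 units (illustrative size).
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))
lr = 0.5

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden activations, shape (4, 8)
    out = sigmoid(h @ W2 + b2)  # network output, shape (4, 1)

    # Backward pass: chain rule on squared error; sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)

    # Gradient descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2).ravel())  # typically converges to ~[0, 1, 1, 0]
```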
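For the grid search notebook, a minimal sketch using scikit-learn's GridSearchCV to tune an SVM classifier on a built-in dataset; the parameter grid is illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Exhaustively try every (C, kernel) combination with 5-fold cross-validation.
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(f"best CV accuracy: {search.best_score_:.3f}")
```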
More notebooks will be added to the list, so stay tuned!