A comprehensive collection of machine learning algorithms implemented both from scratch and using popular libraries. Each implementation includes detailed explanations, mathematical concepts, and practical examples.
## Overview

This repository aims to provide clear, well-documented implementations of machine learning algorithms to help readers understand their inner workings. Each algorithm is implemented twice:
- From scratch using NumPy (to understand the core concepts)
- Using popular libraries like scikit-learn (for practical applications)
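As a sketch of this dual-implementation pattern, here is ordinary least squares done both ways on toy data (the data and variable names are illustrative, not taken from this repository):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(0, 0.1, size=50)

# From scratch (NumPy): least-squares fit with an explicit bias column
Xb = np.hstack([np.ones((len(X), 1)), X])
w = np.linalg.lstsq(Xb, y, rcond=None)[0]   # w = [intercept, slope]

# Library version (scikit-learn)
model = LinearRegression().fit(X, y)

# Both routes solve the same least-squares problem,
# so the estimates agree to numerical precision.
print(w)
print(model.intercept_, model.coef_)
```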
## Implemented Algorithms

### Linear Regression
- Methods:
  - Gradient Descent
  - Normal Equation
- Simple Linear Regression
- Multiple Linear Regression
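A minimal from-scratch sketch of the two fitting methods listed above, gradient descent and the normal equation, on a toy problem (not the repository's code):

```python
import numpy as np

# Noise-free toy data from y = 3x - 0.5
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] - 0.5

Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend a bias column

# Method 1: batch gradient descent on the mean-squared-error loss
w_gd = np.zeros(2)
lr = 0.5
for _ in range(500):
    grad = (2.0 / len(y)) * Xb.T @ (Xb @ w_gd - y)
    w_gd -= lr * grad

# Method 2: normal equation, w = (X^T X)^{-1} X^T y
w_ne = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# Both recover [intercept, slope] ≈ [-0.5, 3.0]
print(w_gd, w_ne)
```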
### Polynomial Regression
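Polynomial regression reduces to linear regression on expanded features; a short sketch with illustrative data:

```python
import numpy as np

# Noise-free samples from y = 1 - 2x + 0.5x^2
rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, size=80)
y = 1.0 - 2.0 * x + 0.5 * x**2

# Build the polynomial feature matrix [1, x, x^2] and solve
# the resulting *linear* least-squares problem.
Phi = np.vander(x, N=3, increasing=True)
coef = np.linalg.lstsq(Phi, y, rcond=None)[0]
print(coef)  # recovers ≈ [1.0, -2.0, 0.5]
```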
### Gradient Descent
- Batch Gradient Descent
- Stochastic Gradient Descent
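The difference between the two variants is how much data each update sees: batch gradient descent uses every sample per step, stochastic gradient descent one sample per step. A toy sketch (function names and hyperparameters are mine):

```python
import numpy as np

# Noise-free toy data from y = 4x + 2
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 1))
y = 4.0 * X[:, 0] + 2.0
Xb = np.hstack([np.ones((len(X), 1)), X])   # bias column

def batch_gd(Xb, y, lr=0.1, epochs=200):
    # One gradient step per epoch, computed over ALL samples.
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        w -= lr * (2.0 / len(y)) * Xb.T @ (Xb @ w - y)
    return w

def sgd(Xb, y, lr=0.01, epochs=20, seed=0):
    # One gradient step per SAMPLE, visiting samples in shuffled order.
    rng = np.random.default_rng(seed)
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            w -= lr * 2.0 * (Xb[i] @ w - y[i]) * Xb[i]
    return w

# Both approach [intercept, slope] = [2.0, 4.0]
print(batch_gd(Xb, y), sgd(Xb, y))
```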
### Neural Networks
- Neural Network from Scratch
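To give a flavour of what a from-scratch network involves (forward pass, backpropagation, parameter updates), here is a minimal sketch that trains on XOR; the architecture and hyperparameters are illustrative, not the notebook's:

```python
import numpy as np

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
lr = 0.5

losses = []
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean(-y * np.log(out) - (1 - y) * np.log(1 - out))))
    # Backward pass (binary cross-entropy with a sigmoid output)
    d_out = (out - y) / len(y)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print("loss:", losses[0], "->", losses[-1])
print("predictions:", (out > 0.5).astype(int).ravel())
```

The training loss falls steadily; with this seed the network typically ends up predicting the XOR pattern correctly.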
### Decision Tree
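The heart of a from-scratch decision tree is choosing the split that most reduces impurity; a sketch of a Gini-based split search on a single feature (helper names are mine):

```python
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum_k p_k^2
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - float(np.sum(p ** 2))

def best_split(x, y):
    # Try each candidate threshold on one feature and keep the one
    # with the lowest weighted Gini impurity of the two children.
    best_t, best_score = None, np.inf
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = float(t), score
    return best_t, best_score

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_split(x, y))  # (3.0, 0.0): a perfect split at x <= 3
```

A full tree builder applies this search recursively to each child node until a stopping criterion (depth, purity, minimum samples) is met.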
### Coming Soon
- Logistic Regression
- Support Vector Machines
- K-means Clustering
- Naive Bayes
- Dimensionality Reduction
## Features
- Detailed Jupyter notebooks with step-by-step explanations
- Mathematical concepts and formulas
- Visualizations of algorithm behavior
- Performance comparisons
- Real-world examples and use cases
- Comprehensive documentation
## Requirements
- Python 3.8+
- NumPy
- Matplotlib
- scikit-learn
- Jupyter Notebook
## Getting Started
1. Clone the repository
2. Install dependencies:
   ```bash
   pip install -r requirements.txt
   ```
3. Navigate to any algorithm folder
4. Open the Jupyter notebooks to explore the implementations
## Project Structure
Each algorithm folder contains:
- Theoretical explanation
- Step-by-step implementation
- Visualization of results
- Practical examples
- Performance evaluation
## Contributing
Contributions are welcome! Feel free to:
- Add new algorithms
- Improve existing implementations
- Add more examples
- Enhance documentation
## License
This project is licensed under the MIT License; see the LICENSE file for details.