A pure NumPy implementation of Linear Regression to understand ML fundamentals.
- R² Score: 0.9459 (the model explains ~94.6% of the variance in price)
- Final Loss (MSE): 673,399,536.67
- Dataset: 100 house samples (size, bedrooms → price)
- ✅ Pure NumPy implementation (no sklearn)
- ✅ Gradient descent optimization from scratch
- ✅ Feature normalization
- ✅ MSE loss function
- ✅ Visualization of results
```bash
pip install numpy matplotlib
python projects/main.py
```

- Forward Pass: `y = X @ weights + bias`
- Calculate Loss: `MSE = mean((y_pred - y_actual)²)`
- Compute Gradients: derivatives of the loss with respect to weights and bias
- Update Parameters: `weights -= learning_rate * gradient`
- Repeat until convergence
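The training loop above can be sketched in pure NumPy. This is a minimal illustration, not the project's actual code: the synthetic data, learning rate, and iteration count are assumptions.

```python
import numpy as np

# Hypothetical data: 100 samples of (size, bedrooms) -> price
rng = np.random.default_rng(0)
X = rng.uniform([500, 1], [3500, 5], size=(100, 2))
y = 150 * X[:, 0] + 10_000 * X[:, 1] + rng.normal(0, 5_000, 100)

# Feature normalization (z-score) so one learning rate works for both features
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

weights = np.zeros(X.shape[1])
bias = 0.0
learning_rate = 0.1

for _ in range(1000):
    y_pred = X_norm @ weights + bias        # forward pass
    error = y_pred - y
    loss = np.mean(error ** 2)              # MSE
    grad_w = 2 * X_norm.T @ error / len(y)  # dMSE/dweights
    grad_b = 2 * error.mean()               # dMSE/dbias
    weights -= learning_rate * grad_w       # update step
    bias -= learning_rate * grad_b
```

Because MSE is convex in the parameters, this loop converges to the least-squares solution for any sufficiently small learning rate.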
- NumPy array operations and matrix multiplication
- Gradient descent algorithm
- Why we square errors in loss functions
- Feature normalization importance
- Model evaluation with R² score
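As a worked example of the R² evaluation mentioned above, here is a small sketch (the helper name `r2_score` and sample values are illustrative, not from the project):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)  # total sum of squares
    return 1 - ss_res / ss_tot

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.1, 7.2, 8.9])
print(r2_score(y_true, y_pred))  # → 0.995
```

An R² of 1.0 means perfect predictions; 0.0 means the model does no better than always predicting the mean.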
Built while learning machine learning fundamentals from scratch!
MIT License - feel free to use for learning!
