moronic09/numpy-linear-regression_MachineLearning

# NumPy Linear Regression from Scratch 🚀

A pure NumPy implementation of Linear Regression to understand ML fundamentals.

## 📊 Results

- **R² score:** 0.9459 (the model explains ≈94.6% of the variance in price)
- **Final loss (MSE):** 673,399,536.67
- **Dataset:** 100 house samples (size, bedrooms → price)

*Training results plot*

## 🚀 Features

- ✅ Pure NumPy implementation (no sklearn)
- ✅ Gradient descent optimization from scratch
- ✅ Feature normalization
- ✅ MSE loss function
- ✅ Visualization of results
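Feature normalization keeps features like house size (thousands of square feet) and bedroom count (single digits) on comparable scales, so one gradient component doesn't dominate the updates. A minimal sketch using z-score standardization (one common choice; the sample values here are made up for illustration, not taken from the repository's dataset):

```python
import numpy as np

# Hypothetical raw feature matrix: house size (sq ft) and bedroom count.
X = np.array([[1400.0, 3.0],
              [2000.0, 4.0],
              [850.0, 2.0],
              [1700.0, 3.0]])

# Z-score normalization: give each column zero mean and unit variance.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std

print(X_norm.mean(axis=0))  # each column's mean is ~0
print(X_norm.std(axis=0))   # each column's std is ~1
```

Store `mean` and `std` from the training data so you can apply the same transformation to any new inputs at prediction time.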

## 📦 Installation

```bash
pip install numpy matplotlib
```

## 💻 Usage

```bash
python projects/main.py
```

## 🧮 How It Works

1. **Forward pass:** `y_pred = X @ weights + bias`
2. **Calculate loss:** `MSE = mean((y_pred - y_actual)²)`
3. **Compute gradients:** derivatives of the loss with respect to `weights` and `bias`
4. **Update parameters:** `weights -= learning_rate * gradient`
5. Repeat until convergence
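The loop above can be sketched end-to-end in NumPy. This is an illustrative standalone example, not the repository's actual code: the synthetic data, variable names, and hyperparameters are assumptions.

```python
import numpy as np

# Synthetic stand-in for the house dataset: 100 samples, 2 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([3.0, -1.5])
y = X @ true_w + 2.0 + rng.normal(scale=0.1, size=100)

weights = np.zeros(2)
bias = 0.0
learning_rate = 0.1
n = len(y)

for _ in range(500):
    y_pred = X @ weights + bias                 # 1. forward pass
    error = y_pred - y
    loss = np.mean(error ** 2)                  # 2. MSE loss
    grad_w = 2.0 * X.T @ error / n              # 3. gradient w.r.t. weights
    grad_b = 2.0 * error.mean()                 #    gradient w.r.t. bias
    weights -= learning_rate * grad_w           # 4. update parameters
    bias -= learning_rate * grad_b              # 5. repeat

print(weights, bias)  # should land near true_w and 2.0
```

Because the features here are already roughly zero-mean and unit-variance, a fixed learning rate converges quickly; on raw house prices you would normalize first, as the Features section notes.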

## 📚 What I Learned

- NumPy array operations and matrix multiplication
- The gradient descent algorithm
- Why we square errors in loss functions
- Why feature normalization matters
- Model evaluation with the R² score
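The R² score measures the fraction of variance in the targets that the model explains, which is why a score of 0.9459 means ≈94.6% of the variance is explained rather than "94.59% accuracy". A minimal sketch with made-up numbers:

```python
import numpy as np

# Hypothetical actual vs. predicted values for illustration.
y_actual = np.array([100.0, 150.0, 200.0, 250.0])
y_pred = np.array([110.0, 140.0, 205.0, 245.0])

# R² = 1 - SS_res / SS_tot
ss_res = np.sum((y_actual - y_pred) ** 2)           # residual sum of squares
ss_tot = np.sum((y_actual - y_actual.mean()) ** 2)  # total sum of squares
r2 = 1.0 - ss_res / ss_tot

print(round(r2, 4))  # → 0.98
```

R² is 1.0 for a perfect fit, 0.0 for a model no better than predicting the mean, and can even go negative for a model worse than that.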

## 👨‍💻 Author

Built while learning machine learning fundamentals from scratch!

## 📝 License

MIT License - feel free to use for learning!
