Multiple Linear Regression with Gradient Descent from Scratch

Video Link: https://www.youtube.com/watch?v=O96hzKRx3O4
In this episode of the Machine Learning fundamentals series, I show you in more detail how linear regression models work, how they are trained using both the Normal Equation and Gradient Descent, and finally how to code all of this from scratch to gain a deeper understanding. #machinelearning #ml
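As a taste of what the episode covers, here is a minimal sketch of fitting a multiple linear regression with the Normal Equation in NumPy. The toy data below is invented for illustration; the Colab notebooks linked under the description use their own datasets and variable names.

```python
import numpy as np

# Hypothetical noiseless data: 100 samples, 2 features (invented for this sketch).
rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1]

# Prepend a column of ones so the intercept is learned as theta[0].
X_b = np.c_[np.ones((X.shape[0], 1)), X]

# Normal Equation: theta = (X^T X)^(-1) X^T y
theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
print(theta)  # recovers approximately [3.0, 2.0, -1.5]
```

Because the target is noiseless and linear, the closed-form solution recovers the true coefficients exactly (up to floating-point precision).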

Colab Notebooks: https://drive.google.com/drive/folders/16fu-B4-Iz2GZdcSe0mIYJ739tk1gNCWw?usp=sharing
Join the Community! - https://discord.gg/cortexdev

Want to Support the Channel?
- Become a Member:
- https://buymeacoffee.com/kodysimpson

My Socials:
Github: https://github.com/KodySimpson
Instagram: https://www.instagram.com/kody_a_simpson/
Twitter: https://twitter.com/kodysimp
Blog: https://simpson.hashnode.dev/

Timestamps:
0:00:00 - How Linear Regression Works
0:26:35 - Implementing the Normal Equation
0:33:57 - Gradient Descent
0:53:03 - Vectorized Gradient Descent
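The last segment above (vectorized Gradient Descent) can be sketched in a few lines of NumPy. This is a generic illustration under assumed toy data, not the exact code from the video or notebooks:

```python
import numpy as np

# Hypothetical toy data (invented for this sketch).
rng = np.random.default_rng(42)
X = rng.random((200, 2))
y = 4.0 + 1.0 * X[:, 0] + 2.0 * X[:, 1]

X_b = np.c_[np.ones((X.shape[0], 1)), X]  # bias column for the intercept
m = X_b.shape[0]

theta = np.zeros(3)   # start from all-zero parameters
lr = 0.5              # learning rate (chosen to converge on this toy data)
for _ in range(5000):
    # Vectorized gradient of the MSE cost: (2/m) * X^T (X theta - y)
    gradients = (2 / m) * X_b.T @ (X_b @ theta - y)
    theta -= lr * gradients

print(theta)  # approaches [4.0, 1.0, 2.0]
```

The whole gradient is computed in one matrix product instead of looping over samples, which is the point of the vectorized version at 0:53:03.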

More videos coming soon.
Leave a comment with any future video suggestions.