Video Description
Transform your data science skills with 36 Regression Machine Learning Algorithms using the scikit-learn and XGBoost Python libraries in this end-to-end ML project on energy efficiency data of houses (heating load and cooling load), including AI Explainability with the SHAP Python library. A hands-on tutorial for learning AI algorithms like random forest regression and building your data science portfolio, with full Python code included in a Google Colab notebook.
====================
👨💻 Do you want to become an End-to-End Data Scientist / Full-Stack Data Scientist?
➊ GET MY FREE Data Science Guide - Data Science Roadmap to Excellence, including 100 Python Libraries for Impactful Machine Learning and Deep Learning
⋙ Grab your Free guide here: https://www.maryammiradi.com/free-guide
➋ JOIN MY END TO END Data Science TRAINING PROGRAM?
⋙ Check out My 4-Week Hands-On Program here: https://www.maryammiradi.com/training
====================
🎥 Suggested Videos:
Classification Data Science Project End to End: 9 Ensemble Learning Methods + Hyperparameter Tuning with Bayesian Search [2024]:
https://youtu.be/i1L7qAV-_rY
====================
▶ Send me direct messages on Linkedin: https://www.linkedin.com/in/maryammiradi/
====================
📚 Link to Python Code in Google Colab:
https://colab.research.google.com/drive/1PeS-aXkRsirYda1Nt0ch91y5zAWjsUnd?usp=sharing
====================
⏱️⏱️VIDEO CHAPTERS⏱️⏱️
0:00 Introduction
0:10 Loading Data & Libraries in Google Colab for 8 Groups of scikit-learn Models (Generalized Linear Models, Support Vector Machines, Nearest Neighbors, Decision Trees and Ensemble Methods, Gaussian Processes, Naive Bayes, Discriminant Analysis and Neural Networks)
3:00 Explanation of Energy Efficiency
4:12 Data Exploration of Input features and Target Features using Pandas
5:19 Model Building and Comparison with 36 Models (scikit-learn and XGBoost): Multivariable Linear Regression, Ridge, Lasso, ElasticNet, LARS, LassoLARS, OMP, Bayesian Ridge, ARD Regression, SGD, Passive Aggressive Regressor, Huber, Quantile Regressor, Support Vector Regression, NuSVR, LinearSVR, K Nearest Neighbors, Radius Neighbors, Decision Tree, Random Forest, Extra Trees, Gradient Boosting, AdaBoost, Bagging, Voting and Stacking Regressors, XGBoost Regressor, Gaussian Process Regressor, Naive Bayes (Gaussian, Multinomial, Bernoulli, Complement, Categorical), Discriminant Analysis (Linear, Quadratic) and MLP Regressor
15:16 Model Comparison Using k-Fold Cross-Validation for Model Selection
20:48 Build Final Model with XGBoost Python Library
22:18 Test Results of the Final Model Trained with the XGBoost Algorithm
22:39 AI Explainability with SHAP Values (Global Explainability)
30:16 Outro
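If you want a taste of the model-comparison step before opening the Colab notebook, here is a minimal sketch of k-fold cross-validation across a few scikit-learn regressors. It uses synthetic data from `make_regression` as a stand-in for the energy efficiency dataset, and only three of the 36 models, purely for illustration:

```python
# Minimal sketch: compare several scikit-learn regressors with k-fold
# cross-validation, the same model-selection pattern covered at 15:16.
# Synthetic data stands in for the energy efficiency (heating load) data.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import KFold, cross_val_score

# 768 rows / 8 features mirrors the shape of the UCI Energy Efficiency data.
X, y = make_regression(n_samples=768, n_features=8, noise=5.0, random_state=42)

models = {
    "Ridge": Ridge(),
    "Lasso": Lasso(),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=42),
}

kf = KFold(n_splits=10, shuffle=True, random_state=42)
for name, model in models.items():
    # R^2 score on each of the 10 held-out folds.
    scores = cross_val_score(model, X, y, cv=kf, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

The full notebook applies the same pattern to all 36 models, then trains the final XGBoost model and explains it with SHAP.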
__________
✍️ Leave any questions you have about AI & Data Science in the comments!