Category: Programming
-
Building Strong GLMs in Python via ML + XAI
We use Python to craft a strong GLM using insights from a boosted trees model.
-
ML + XAI -> Strong GLM
In this post, we improve a simple GLM using insights from a boosted trees model.
-
Explain that tidymodels blackbox!
In this post you will learn how to explain a {tidymodels} blackbox with classic XAI and SHAP.
-
Permutation SHAP versus Kernel SHAP
When do the two methods agree, and when do they disagree?
-
Interactions – where are you?
This question sends shivers down the poor modeler’s spine… The {hstats} R package introduced in our last post measures their strength using Friedman’s H-statistics, a collection of statistics based on partial dependence functions. On GitHub, the preview version of {hstats} 1.0.0 is out – I will try to bring it to CRAN in about one week…
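Friedman’s pairwise H²-statistic compares the two-dimensional partial dependence of a feature pair with the sum of the two one-dimensional partial dependences: if the model has no interaction between the pair, the difference vanishes. Here is a minimal brute-force Python sketch of that statistic (toy model and data for illustration only – this is not the optimized {hstats} implementation):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
# y has an explicit x0*x1 interaction; x2 enters only additively
y = X[:, 0] * X[:, 1] + X[:, 2]

model = GradientBoostingRegressor(random_state=0).fit(X, y)

def pd_values(model, X, features):
    """Centered partial dependence of `features`, evaluated at each row of X."""
    out = np.empty(len(X))
    for i, row in enumerate(X):
        X_mod = X.copy()
        X_mod[:, features] = row[features]  # fix the features of interest
        out[i] = model.predict(X_mod).mean()
    return out - out.mean()

def h2_pairwise(model, X, j, k):
    """Friedman's H² for the pair (j, k): unexplained part of the 2D PD."""
    pd_jk = pd_values(model, X, [j, k])
    pd_j = pd_values(model, X, [j])
    pd_k = pd_values(model, X, [k])
    return np.sum((pd_jk - pd_j - pd_k) ** 2) / np.sum(pd_jk ** 2)

h_01 = h2_pairwise(model, X, 0, 1)  # true interaction -> clearly positive
h_02 = h2_pairwise(model, X, 0, 2)  # no true interaction -> near zero
print(h_01, h_02)
```

The brute-force version needs O(n²) predictions per statistic; {hstats} and similar implementations evaluate the partial dependence on subsamples or grids to keep this tractable.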
-
It’s the interactions
What makes an ML model a black box? It is the interactions. Without any interactions, the ML model is additive and can be described exactly. Studying the interaction effects of ML models is challenging. The main XAI approaches are: This post is mainly about the third approach. Its beauty is that we get information about all interactions.…
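The "additive and exactly describable" claim can be checked numerically: boosting with depth-1 trees (stumps) can never split on two features within one tree, so the ensemble is purely additive, and the prediction is recovered exactly from the univariate partial dependence effects. A small sketch with made-up data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2

# Depth-1 trees force an additive model: f(x) = c + f1(x1) + f2(x2)
model = GradientBoostingRegressor(
    max_depth=1, n_estimators=300, random_state=0
).fit(X, y)
pred = model.predict(X)

def centered_pd(model, X, j):
    """Centered univariate partial dependence of feature j at each row."""
    out = np.empty(len(X))
    for i in range(len(X)):
        X_mod = X.copy()
        X_mod[:, j] = X[i, j]
        out[i] = model.predict(X_mod).mean()
    return out - out.mean()

# For an additive model, mean + sum of centered univariate effects
# reproduces the prediction exactly (up to float error)
recon = pred.mean() + centered_pd(model, X, 0) + centered_pd(model, X, 1)
print(np.max(np.abs(recon - pred)))
```

With deeper trees (interactions allowed), the same reconstruction leaves a gap – and that gap is precisely what interaction statistics quantify.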
-
Model Diagnostics in Python
Version 1.0.0 of the new Python package for model-diagnostics was just released on PyPI.
-
Geographic SHAP
“R Python” continued… Geographic SHAP
-
SHAP + XGBoost + Tidymodels = LOVE
tidymodels and shapviz to explain XGBoost models
-
Dplyr-style without dplyr
How to get the “dplyr” feeling without “dplyr”
-
Interpret Complex Linear Models with SHAP within Seconds
Peeking into richly parametrized linear models with SHAP? Yes!
-
Histograms, Gradient Boosted Trees, Group-By Queries and One-Hot Encoding
This post shows how filling histograms can be done in very different ways, thereby connecting very different areas: from gradient boosted trees to SQL queries to one-hot encoding. Let’s jump into it! Modern gradient boosted trees (GBT) like LightGBM, XGBoost and the HistGradientBoostingRegressor of scikit-learn all use two techniques on top of standard gradient boosting:…
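The connection can be sketched in a few lines of NumPy (toy data, not from the post): filling a per-feature histogram of gradient sums, as histogram-based GBTs do, is a group-by sum over bin indices – and the very same aggregation can be written as a product with a one-hot indicator matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
values = rng.normal(size=1000)                       # e.g. gradients in a GBT
bins = np.digitize(values, np.linspace(-3, 3, 16))   # bin index per row = group-by key

# Variant 1: group-by sum via bincount (histogram filling)
sums_bincount = np.bincount(bins, weights=values, minlength=17)

# Variant 2: the same aggregation as a one-hot matrix product
one_hot = np.eye(17)[bins]                           # (n, n_bins) indicator matrix
sums_onehot = one_hot.T @ values

print(np.allclose(sums_bincount, sums_onehot))  # True
```

The bincount variant runs in O(n) without materializing the indicator matrix, which is why histogram-based GBT implementations use it; the one-hot formulation is mathematically the same aggregation.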