Model Explanation
Out:
Running DummyClassifier(strategy='prior')
accuracy: 0.391 recall_macro: 0.333 precision_macro: 0.130 f1_macro: 0.187
=== new best DummyClassifier(strategy='prior') (using recall_macro):
accuracy: 0.391 recall_macro: 0.333 precision_macro: 0.130 f1_macro: 0.187
Running GaussianNB()
accuracy: 0.970 recall_macro: 0.972 precision_macro: 0.972 f1_macro: 0.970
=== new best GaussianNB() (using recall_macro):
accuracy: 0.970 recall_macro: 0.972 precision_macro: 0.972 f1_macro: 0.970
Running MultinomialNB()
accuracy: 0.932 recall_macro: 0.935 precision_macro: 0.942 f1_macro: 0.936
Running DecisionTreeClassifier(class_weight='balanced', max_depth=1)
accuracy: 0.557 recall_macro: 0.602 precision_macro: 0.417 f1_macro: 0.473
Running DecisionTreeClassifier(class_weight='balanced', max_depth=5)
accuracy: 0.872 recall_macro: 0.862 precision_macro: 0.886 f1_macro: 0.866
Running DecisionTreeClassifier(class_weight='balanced', min_impurity_decrease=0.01)
accuracy: 0.872 recall_macro: 0.862 precision_macro: 0.886 f1_macro: 0.866
Running LogisticRegression(C=0.1, class_weight='balanced', max_iter=1000)
accuracy: 0.962 recall_macro: 0.967 precision_macro: 0.967 f1_macro: 0.961
Running LogisticRegression(class_weight='balanced', max_iter=1000)
accuracy: 0.969 recall_macro: 0.973 precision_macro: 0.971 f1_macro: 0.969
=== new best LogisticRegression(class_weight='balanced', max_iter=1000) (using recall_macro):
accuracy: 0.969 recall_macro: 0.973 precision_macro: 0.971 f1_macro: 0.969
Best model:
LogisticRegression(class_weight='balanced', max_iter=1000)
Best Scores:
accuracy: 0.969 recall_macro: 0.973 precision_macro: 0.971 f1_macro: 0.969
              precision    recall  f1-score   support

           0       1.00      1.00      1.00        13
           1       0.95      1.00      0.97        19
           2       1.00      0.92      0.96        13

    accuracy                           0.98        45
   macro avg       0.98      0.97      0.98        45
weighted avg       0.98      0.98      0.98        45

[[13  0  0]
 [ 0 19  0]
 [ 0  1 12]]
/home/circleci/project/dabl/plot/utils.py:375: UserWarning: FixedFormatter should only be used together with FixedLocator
ax.set_yticklabels(
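Each "Running ..." line in the output reports mean cross-validation scores for four metrics, and the best model is selected by recall_macro. The snippet below is a minimal sketch of how comparable numbers can be computed for a single candidate with scikit-learn's multi-metric cross_validate; it is not dabl's internal search code, and the exact values depend on the data split and folds.

from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

# Hypothetical illustration: score one candidate with the same four metrics.
X, y = load_wine(return_X_y=True)
scoring = ["accuracy", "recall_macro", "precision_macro", "f1_macro"]
cv_results = cross_validate(
    LogisticRegression(class_weight="balanced", max_iter=1000),
    X, y, scoring=scoring, cv=5)
for metric in scoring:
    print(metric, cv_results["test_" + metric].mean().round(3))

The full example that produced the output above follows.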
from dabl.models import SimpleClassifier
from dabl.explain import explain
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split

# Load the wine dataset and hold out a test set.
wine = load_wine()
X_train, X_test, y_train, y_test = train_test_split(wine.data, wine.target)

# SimpleClassifier tries several candidate models and keeps the best one.
sc = SimpleClassifier()
sc.fit(X_train, y_train)

# Explain the fitted model using the held-out data.
explain(sc, X_test, y_test)
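explain prints the classification report and confusion matrix shown in the output above and also draws diagnostic plots (the matplotlib warning comes from that plotting code). As a rough sketch, assuming the fitted SimpleClassifier exposes the usual scikit-learn predict method, the textual part can be reproduced directly:

from sklearn.metrics import classification_report, confusion_matrix

# Assumption: sc and the held-out split come from the example above.
y_pred = sc.predict(X_test)
print(classification_report(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))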
Total running time of the script: ( 0 minutes 0.806 seconds)