Let's practice with the decision tree code provided by scikit-learn.
from sklearn import tree
# Toy training data: two samples, two features each
X = [[0, 0], [1, 1]]
Y = [0, 1]
# Fit a decision tree classifier
clf = tree.DecisionTreeClassifier()
clf = clf.fit(X, Y)
# Predict the class and class probabilities for a new sample
clf.predict([[2., 2.]])
clf.predict_proba([[2., 2.]])
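With this toy data the tree learns a single split that separates the two training points, so the query [2., 2.] falls in the class-1 leaf: predict returns array([1]) and predict_proba returns array([[0., 1.]]).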
from sklearn.datasets import load_iris
from sklearn import tree
# Load iris as a Bunch so iris.feature_names and iris.target_names can be reused below
iris = load_iris()
X, y = iris.data, iris.target
# Fit a decision tree classifier on the full iris dataset
clf = tree.DecisionTreeClassifier()
clf = clf.fit(X, y)
# Plot the fitted tree with matplotlib
tree.plot_tree(clf)
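plot_tree draws on the current matplotlib axes, so the figure can be sized and saved before it is shown. A minimal sketch (the figure size and output file name here are just illustrative assumptions):

import matplotlib.pyplot as plt
plt.figure(figsize=(12, 8))        # enlarge the figure before drawing (illustrative size)
tree.plot_tree(clf, filled=True)   # filled=True colors nodes by majority class
plt.savefig("iris_tree.png")       # hypothetical output file name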
import graphviz
# Export the tree in Graphviz DOT format and render it to a file
dot_data = tree.export_graphviz(clf, out_file=None)
graph = graphviz.Source(dot_data)
graph.render("iris")   # writes "iris" (DOT source) and "iris.pdf"
# Export again with feature and class names, filled and rounded nodes
dot_data = tree.export_graphviz(clf, out_file=None,
                                feature_names=iris.feature_names,
                                class_names=iris.target_names,
                                filled=True, rounded=True,
                                special_characters=True)
graph = graphviz.Source(dot_data)
graph   # in a Jupyter notebook, this displays the rendered graph inline
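If Graphviz is not installed, a plain-text rendering of the same tree is a lighter-weight alternative. A minimal sketch using sklearn's export_text:

from sklearn.tree import export_text
# Print the fitted iris tree as indented if/else rules, using the same feature names
print(export_text(clf, feature_names=iris.feature_names))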
Reference: scikit-learn.org/stable/modules/tree.html#classification