Pre-pruning a Decision Tree – restricting depth

In general, the deeper you allow a tree to grow, the more complex the model becomes: more splits capture more information about the training data, and this is one of the root causes of overfitting. We can limit growth with the tree's max_depth parameter:
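As a minimal sketch of this idea, the snippet below compares an unrestricted tree with one pre-pruned to a depth of 3. The Iris dataset, the depth of 3, and random_state=42 are illustrative assumptions, not fixed choices:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small example dataset, standing in for your own data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Unrestricted tree: grows until every leaf is pure (prone to overfitting)
full_tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Pre-pruned tree: no node deeper than 3 levels
pruned_tree = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X_train, y_train)

print("full tree depth:  ", full_tree.get_depth())
print("pruned tree depth:", pruned_tree.get_depth())
print("full   train/test accuracy:",
      full_tree.score(X_train, y_train), full_tree.score(X_test, y_test))
print("pruned train/test accuracy:",
      pruned_tree.score(X_train, y_train), pruned_tree.score(X_test, y_test))
```

Typically the unrestricted tree scores near-perfectly on the training set while the pre-pruned tree trades a little training accuracy for a simpler model that generalizes at least as well.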

Feature Importance in Decision Tree

In scikit-learn, the feature_importances_ attribute is available on tree-based models such as Decision Trees, Random Forests, and Gradient Boosted Trees. It provides a way to assess how much each feature (or variable) contributes to the trained model's predictions. When you train a tree-based model, the algorithm makes decisions at each node based on feature values; features whose splits produce larger reductions in impurity receive higher importance scores, normalized so that the scores sum to one.
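A short sketch of reading these scores, again assuming the Iris dataset and a depth-3 tree as stand-ins:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

data = load_iris()
clf = DecisionTreeClassifier(max_depth=3, random_state=42).fit(data.data, data.target)

# feature_importances_ sums to 1.0; higher means larger total impurity reduction
for name, importance in sorted(zip(data.feature_names, clf.feature_importances_),
                               key=lambda pair: pair[1], reverse=True):
    print(f"{name}: {importance:.3f}")
```

Features with an importance of 0.0 were never used in a split; for a shallow tree this is common.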

Visualizing the Decision Tree

To visualize a decision tree in scikit-learn, you can use the plot_tree function from the sklearn.tree module, which generates a graphical representation of the fitted tree. To show the decision tree as text instead, you can use the export_text function from the same module, which returns a plain-text summary of the decision rules.
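Here's a simple example showing both views side by side; the Iris dataset and depth-3 tree are stand-in assumptions carried over from above:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text, plot_tree

data = load_iris()
clf = DecisionTreeClassifier(max_depth=3, random_state=42).fit(data.data, data.target)

# Graphical view: each node shows its split, impurity, samples, and class counts
plt.figure(figsize=(12, 6))
plot_tree(clf, feature_names=data.feature_names,
          class_names=data.target_names, filled=True)
plt.show()

# Plain-text view of the same decision rules
print(export_text(clf, feature_names=list(data.feature_names)))
```

The text form is handy in logs and notebooks where rendering a figure is inconvenient; both describe exactly the same fitted tree.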

Receiver Operating Characteristic (ROC) and Area Under Curve (AUC)

The term "Receiver Operating Characteristic" (ROC) originated in signal detection theory during World War II, where it was used to analyze and measure the performance of radar receivers. In machine learning, the ROC curve is a graphical representation of the trade-off between the true positive rate (sensitivity) and the false positive rate (1 - specificity) as the classification threshold varies. The Area Under the Curve (AUC) summarizes this trade-off in a single number: 1.0 is a perfect classifier, while 0.5 is no better than random guessing.
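As a minimal sketch, assuming the binary breast-cancer dataset as stand-in data and the same depth-3 tree as before:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X_train, y_train)

# A ROC curve needs scores, not hard labels: take P(class = 1)
y_scores = clf.predict_proba(X_test)[:, 1]

fpr, tpr, thresholds = roc_curve(y_test, y_scores)
auc = roc_auc_score(y_test, y_scores)

plt.plot(fpr, tpr, label=f"AUC = {auc:.3f}")
plt.plot([0, 1], [0, 1], linestyle="--")  # chance line (AUC = 0.5)
plt.xlabel("False positive rate (1 - specificity)")
plt.ylabel("True positive rate (sensitivity)")
plt.legend()
plt.show()
```

Each point on the curve corresponds to one classification threshold; sweeping the threshold from 1 down to 0 traces the curve from the bottom-left corner to the top-right.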

Classification metrics: Accuracy, Precision, Recall, and F1-score

Suppose we have a binary classification problem in which we have to predict two classes: 1 and 0. A machine learning model tends to make some mistakes, incorrectly classifying data points so that the actual and predicted classes differ. Four possible scenarios can occur: a true positive (predicted 1, actually 1), a false positive (predicted 1, actually 0), a true negative (predicted 0, actually 0), and a false negative (predicted 0, actually 1). Clearly, we want to maximize the correct predictions and minimize both kinds of error; accuracy, precision, recall, and the F1-score each summarize these four counts from a different angle.
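A sketch of computing all four metrics plus the underlying confusion matrix, reusing the breast-cancer dataset and depth-3 tree assumed above:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

y_pred = DecisionTreeClassifier(max_depth=3, random_state=42).fit(
    X_train, y_train).predict(X_test)

# Rows = actual class, columns = predicted class:
# [[TN, FP],
#  [FN, TP]]
print(confusion_matrix(y_test, y_pred))
print("accuracy :", accuracy_score(y_test, y_pred))   # (TP + TN) / total
print("precision:", precision_score(y_test, y_pred))  # TP / (TP + FP)
print("recall   :", recall_score(y_test, y_pred))     # TP / (TP + FN)
print("f1       :", f1_score(y_test, y_pred))         # harmonic mean of the two
```

Which metric matters most depends on the cost of each error type: precision penalizes false positives, recall penalizes false negatives, and the F1-score balances the two, which is especially useful on imbalanced classes where accuracy alone can mislead.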