Q No: 1
What is the final objective of a Decision Tree?
- Maximise the Gini Index of the leaf nodes
- Minimise the homogeneity of the leaf nodes
- Maximise the heterogeneity of the leaf nodes
- Minimise the impurity of the leaf nodes
Ans: Minimise the impurity of the leaf nodes
In a decision tree, each split should reduce the 'impurity' in the resulting child nodes, so that we eventually end up with leaf nodes that have the least possible impurity (measured, for example, by the Gini index or by entropy).
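As a quick illustration (plain Python, no library assumed), both common impurity measures drop as a node becomes more homogeneous:

```python
from math import log2

def gini(counts):
    """Gini impurity of a node, given the class counts in that node."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

def entropy(counts):
    """Entropy of a node, given the class counts in that node."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

# The more homogeneous the node, the lower both measures get.
print(gini([50, 50]), entropy([50, 50]))   # 0.5, 1.0  -> maximally impure
print(gini([90, 10]), entropy([90, 10]))   # 0.18, ~0.47
print(gini([100, 0]), entropy([100, 0]))   # both 0    -> pure leaf
```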
Q No: 2
Decision Trees can be used to predict
- Continuous Target Variables
- Categorical Target Variables
- Random Variables
- Both Continuous and Categorical Target Variables
Ans: Both Continuous and Categorical Target Variables
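As a minimal sketch (scikit-learn assumed here, the library the hyperparameter names later in this quiz come from), the two cases map to two separate estimators:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X = np.array([[1.0], [2.0], [3.0], [4.0]])

# Categorical target -> classification tree
clf = DecisionTreeClassifier().fit(X, np.array(["no", "no", "yes", "yes"]))
print(clf.predict([[2.5]]))   # predicts a class label

# Continuous target -> regression tree
reg = DecisionTreeRegressor().fit(X, np.array([1.2, 1.9, 3.1, 4.0]))
print(reg.predict([[2.5]]))   # predicts a numeric value
```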
Q No: 3
When we create a Decision Tree, how is the best split determined at each node?
- We split the data using the first independent variable and so on.
- The first split is determined randomly and from then on we start choosing the best split.
- We make at most 5 splits on the data using only one independent variable and choose the split that gives the highest Gini gain.
- We make all possible splits on the data using the independent variables and choose the split that gives the highest Gini gain.
Ans: We make all possible splits on the data using the independent variables and choose the split that gives the highest Gini gain.
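A toy sketch of that exhaustive search (plain Python; the function and variable names are illustrative, not from any library): every threshold on every independent variable is tried, and the split with the highest Gini gain, i.e. the largest drop in weighted impurity relative to the parent node, wins.

```python
def gini(labels):
    n = len(labels)
    return 1 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(X, y):
    """Try every feature/threshold pair and return the split with the highest Gini gain."""
    parent = gini(y)
    best = None   # (gain, feature_index, threshold)
    for j in range(len(X[0])):                    # every independent variable
        for t in sorted({row[j] for row in X}):   # every possible split point
            left  = [lab for row, lab in zip(X, y) if row[j] <= t]
            right = [lab for row, lab in zip(X, y) if row[j] >  t]
            if not left or not right:
                continue
            weighted = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            gain = parent - weighted              # Gini gain of this candidate split
            if best is None or gain > best[0]:
                best = (gain, j, t)
    return best

X = [[2, 7], [3, 6], [8, 1], [9, 2]]
y = ["a", "a", "b", "b"]
print(best_split(X, y))   # a split on feature 0 that separates the two classes perfectly
```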
Q No: 4
Which of the following is not true about Decision Trees?
- Decision Trees tend to overfit the test data
- Decision Trees can be pruned to reduce overfitting
- Decision Trees would grow to the maximum possible depth to achieve 100% purity in the leaf nodes; this generally leads to overfitting.
- Decision Trees can capture complex patterns in the data.
Ans: Decision Trees tend to overfit the test data
Decision trees tend to overfit the training data, not the test data; the other statements are true.
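A quick way to see the distinction in practice (scikit-learn assumed; the exact scores depend on the data): a fully grown tree typically scores close to 100% on the training data but noticeably lower on held-out test data, while a pre-pruned tree narrows that gap.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full   = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)               # unrestricted
pruned = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)  # pre-pruned

# The unrestricted tree memorises the training data (train score ~1.0),
# which is what "overfitting the training data" refers to.
print("full   train/test:", full.score(X_tr, y_tr), full.score(X_te, y_te))
print("pruned train/test:", pruned.score(X_tr, y_tr), pruned.score(X_te, y_te))
```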
Q No: 5
If we increase the value of the hyperparameter min_samples_leaf from the default value, we would end up getting a ______________ tree than the tree with the default value.
- smaller
- bigger
Ans: smaller
min_samples_leaf = the minimum number of samples required at a leaf node
As the number of observations required in each leaf node increases, the size of the tree decreases.
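A small check of this behaviour (scikit-learn assumed; the exact node counts depend on the dataset):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

for leaf in (1, 5, 20):   # 1 is the scikit-learn default
    tree = DecisionTreeClassifier(min_samples_leaf=leaf, random_state=0).fit(X, y)
    print(f"min_samples_leaf={leaf:2d} -> "
          f"nodes={tree.tree_.node_count}, depth={tree.tree_.max_depth}")
# Larger min_samples_leaf -> fewer nodes and a shallower, i.e. smaller, tree.
```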
Q No: 6
Which of the following is a perfectly impure node?
- Node - 0
- Node - 1
- Node - 2
- None of these
Ans: Node - 1
Gini = 0.5 at Node 1
Gini = 0 -> perfectly pure
Gini = 0.5 -> perfectly impure
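For a two-class node the worked numbers are: Gini = 1 − (p₁² + p₂²), so a 50/50 node gives 1 − (0.5² + 0.5²) = 0.5, the maximum possible impurity with two classes, while a node holding a single class gives 1 − 1² = 0.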
Q No: 7
In a classification setting, if we do not limit the size of the decision tree, it will stop growing only when all the leaves are:
- at the same depth
- of the same size
- homogeneous
- heterogeneous
Ans: homogeneous
The tree stops splitting only once the impurity in every leaf is zero.
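This can be verified directly on a fitted tree (scikit-learn assumed; on a dataset where no two identical rows carry different labels, every leaf of an unrestricted tree ends up pure):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(random_state=0).fit(X, y)   # no size limits

t = tree.tree_
leaves = t.children_left == -1        # leaf nodes have no children
print(np.unique(t.impurity[leaves]))  # [0.] -> every leaf is homogeneous
```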
Q No: 8
Which of the following explains pre-pruning?
- Before pruning a decision tree, we need to create the tree. This process of creating the tree before pruning is known as pre-pruning.
- Starting with a full-grown tree and creating trees that are sequentially smaller is known as pre-pruning
- We stop the decision tree from growing to its full length by bounding the hyperparameters; this is known as pre-pruning.
- Building a decision tree on default hyperparameter values is known as pre-pruning.
Ans: We stop the decision tree from growing to its full length by bounding the hyperparameters; this is known as pre-pruning.
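In scikit-learn terms (assumed here), pre-pruning simply means bounding the growth hyperparameters before fitting; a sketch of the usual knobs:

```python
from sklearn.tree import DecisionTreeClassifier

# Pre-pruning: the tree is never allowed to grow to its full length in the first place.
pre_pruned = DecisionTreeClassifier(
    max_depth=4,            # cap the depth of the tree
    min_samples_split=20,   # a node needs at least 20 samples to be split further
    min_samples_leaf=10,    # every leaf must keep at least 10 samples
    max_leaf_nodes=16,      # cap the total number of leaves
)
# Post-pruning, by contrast, grows the full tree first and then cuts it back,
# e.g. via cost-complexity pruning (the ccp_alpha parameter).
```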
Q No: 9
Which of the following is the same across Classification and Regression Decision Trees?
- Type of predicted variable
- Impurity Measure/ Splitting Criteria
- max_depth parameter
Ans: max_depth parameter
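A minimal check (scikit-learn assumed, parameter names as in recent versions): max_depth and the other size-related hyperparameters exist for both tree types, whereas the predicted variable and the splitting criterion differ.

```python
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

clf = DecisionTreeClassifier(max_depth=3)   # criterion options: 'gini', 'entropy', ...
reg = DecisionTreeRegressor(max_depth=3)    # criterion options: 'squared_error', ...

print(clf.get_params()["max_depth"], reg.get_params()["max_depth"])   # 3 3
```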
Q No: 10
Select the correct order in which a decision tree is built:
1. Calculate the Gini impurity after each split
2. Decide the best split based on the lowest Gini impurity
3. Repeat the complete process until the stopping criterion is reached or the tree has achieved homogeneity in leaves.
4. Select an attribute of data and make all possible splits in data
5. Repeat the steps for every attribute present in the data
- 4,1,3,2,5
- 4,1,5,2,3
- 4,1,5,3,2
Ans: 4,1,5,2,3
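Putting the ordered steps together, a highly simplified recursive sketch (plain Python; the function names are illustrative only). It returns a nested dict describing the tree.

```python
def gini(labels):
    n = len(labels)
    return 1 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def build(X, y, depth=0, max_depth=5):
    # Step 3: stop when the node is homogeneous or the stopping criterion is reached
    if gini(y) == 0 or depth == max_depth:
        return {"leaf": max(set(y), key=y.count)}
    best = None
    for j in range(len(X[0])):                  # Steps 4 & 5: select each attribute in turn
        for t in {row[j] for row in X}:         # Step 4: make all possible splits on it
            left  = [i for i, row in enumerate(X) if row[j] <= t]
            right = [i for i, row in enumerate(X) if row[j] >  t]
            if not left or not right:
                continue
            # Step 1: Gini impurity after this split (weighted over the two children)
            score = (len(left) * gini([y[i] for i in left]) +
                     len(right) * gini([y[i] for i in right])) / len(y)
            # Step 2: keep the split that gives the lowest Gini impurity
            if best is None or score < best[0]:
                best = (score, j, t, left, right)
    if best is None:                            # no useful split left -> majority-class leaf
        return {"leaf": max(set(y), key=y.count)}
    _, j, t, left, right = best
    return {"split": (j, t),
            "left":  build([X[i] for i in left],  [y[i] for i in left],  depth + 1, max_depth),
            "right": build([X[i] for i in right], [y[i] for i in right], depth + 1, max_depth)}
```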