NPTEL Introduction To Machine Learning Week 6 Assignment Answer 2023


1. Which of the following is/are major advantages of decision trees over other supervised learning techniques? (Note that more than one choice may be correct.)

  • Theoretical guarantees of performance
  • Higher performance
  • Interpretability of classifier
  • More powerful in its ability to represent complex functions
Answer :- Interpretability of classifier. A decision tree yields human-readable rules; it offers no special theoretical performance guarantees and is not inherently more accurate or more expressive than other supervised methods.

2. Increasing the pruning strength in a decision tree by reducing the maximum depth:

  • Will always result in improved validation accuracy.
  • Will lead to more overfitting.
  • Might lead to underfitting if set too aggressively.
  • Will have no impact on the tree’s performance.
  • Will eliminate the need for validation data.
Answer :- Might lead to underfitting if set too aggressively. A very shallow tree cannot capture the structure of the data, so validation accuracy can fall just as it does with overfitting; validation data is still needed to pick the right depth.

3. Consider the following statements:
Statement 1: Decision Trees are linear non-parametric models.
Statement 2: A decision tree may be used to explain the complex function learned by a neural network.

  • Both the statements are True.
  • Statement 1 is True, but Statement 2 is False.
  • Statement 1 is False, but Statement 2 is True.
  • Both the statements are False.

Answer :- Statement 1 is False, but Statement 2 is True. Decision trees are non-linear, non-parametric models, and a tree can be trained as an interpretable surrogate that approximates the function learned by a neural network.

4. Consider the following dataset:

[Dataset table appeared here as an image in the original post; it is not reproduced in this text.]

What is the initial entropy of Malignant?

  • 0.543
  • 0.9798
  • 0.8732
  • 1
Answer :- Requires the dataset table above. The initial entropy is H(Malignant) = −p₊ log₂ p₊ − p₋ log₂ p₋, where p₊ and p₋ are the proportions of malignant and non-malignant rows.

5. For the same dataset, what is the information gain of Vaccination?

  • 0.4763
  • 0.2102
  • 0.1134
  • 0.9355
Answer :- Requires the dataset table above. Information gain = H(Malignant) − Σᵥ (|Sᵥ|/|S|) · H(Sᵥ), summed over the values v of Vaccination.
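The entropy and information-gain computations behind Questions 4 and 5 can be sketched in pure Python. The dataset below is a made-up stand-in (the assignment's actual table was an image and is not reproduced here), so the numbers illustrate the method, not the graded answer:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H = -sum(p * log2(p)) over the class distribution."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(labels, feature_values):
    """Info gain of a feature = H(labels) minus the weighted average
    entropy of the label groups induced by each feature value."""
    n = len(labels)
    groups = {}
    for f, y in zip(feature_values, labels):
        groups.setdefault(f, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Hypothetical toy data (NOT the assignment's table): 10 patients,
# 5 malignant (1) and 5 benign (0); vaccination splits them 4:1 vs 1:4.
malignant  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
vaccinated = ['n', 'n', 'n', 'n', 'y', 'y', 'y', 'y', 'y', 'n']
print(round(entropy(malignant), 4))             # 1.0 for a 50/50 class split
print(round(info_gain(malignant, vaccinated), 4))  # 0.2781
```

With the real table, the same two calls would reproduce the graded answers for Questions 4 and 5.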

6. Which of the following machine learning models can solve the XOR problem without any transformations on the input space?

  • Linear Perceptron
  • Neural Networks
  • Decision Trees
  • Logistic Regression
Answer :- Neural Networks and Decision Trees. Both can carve the input space non-linearly; a linear perceptron and logistic regression have linear decision boundaries and cannot separate XOR.

7. Statement: Decision Tree is an unsupervised learning algorithm.
Reason: The splitting criterion uses only the features of the data to calculate their respective measures.

  • Statement is True. Reason is True.
  • Statement is True. Reason is False.
  • Statement is False. Reason is True.
  • Statement is False. Reason is False.
Answer :- Statement is False. Reason is False. Decision trees are supervised (they require labeled data), and splitting criteria such as entropy and Gini impurity are computed from the class labels, not from the features alone.

8. ______ is a measure of the likelihood of an incorrect classification of a new instance, if the instance is labeled randomly according to the distribution of class labels in the dataset.

  • Gini impurity.
  • Entropy.
  • Information gain.
  • None of the above.
Answer :- Gini impurity. Entropy measures uncertainty rather than misclassification probability, and information gain measures the reduction in impurity achieved by a split.
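A minimal sketch of the Gini impurity definition in Question 8 (pure Python; the function name is my own):

```python
from collections import Counter

def gini(labels):
    """Gini impurity: the probability that a randomly drawn instance is
    misclassified when labeled randomly according to the empirical
    class distribution, i.e. 1 - sum(p_i ** 2)."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini(['a', 'b', 'a', 'b']))  # 0.5 -- maximum for two balanced classes
print(gini(['a', 'a', 'a', 'a']))  # 0.0 -- a pure node
```

Decision tree libraries use exactly this quantity as one of the standard split criteria alongside entropy.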

9. What is a common indicator of overfitting in a decision tree?

  • The training accuracy is high while the validation accuracy is low.
  • The tree is shallow.
  • The tree has only a few leaf nodes.
  • The tree’s depth matches the number of attributes in the dataset.
  • The tree’s predictions are consistently biased.
Answer :- The training accuracy is high while the validation accuracy is low. A large gap between training and validation accuracy indicates the tree has memorized the training data rather than learned a generalizable pattern.

10. Consider a dataset with only one (categorical) attribute. If the attribute has 10 unordered values, how many candidate binary splits must be considered to find the best split for the decision tree classifier?

  • 10
  • 511
  • 1023
  • 512
Answer :- 511. Each binary split of 10 unordered values is an unordered pair consisting of a nonempty proper subset and its complement, giving 2⁹ − 1 = 511 distinct splits.
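The count in Question 10 can be checked by brute force for small k: every binary split of a categorical attribute is a nonempty proper subset paired with its complement, and each such pair is counted once, giving 2^(k−1) − 1. A small sketch (function names are my own):

```python
from itertools import combinations

def num_binary_splits(k):
    """Distinct binary splits of k unordered categorical values:
    (2**k - 2) nonempty proper subsets, halved because a subset and
    its complement define the same split -> 2**(k-1) - 1."""
    return 2 ** (k - 1) - 1

def enumerate_splits(values):
    """Brute-force check: pair every subset with its complement and
    deduplicate the unordered pairs."""
    values = list(values)
    seen = set()
    for r in range(1, len(values)):
        for left in combinations(values, r):
            right = tuple(v for v in values if v not in left)
            seen.add(frozenset([frozenset(left), frozenset(right)]))
    return len(seen)

print(num_binary_splits(10))  # 511
print(enumerate_splits(range(5)), num_binary_splits(5))  # 15 15
```

The brute-force count agrees with the closed form, which is why exhaustive categorical splitting becomes infeasible as the number of values grows.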
Course Name: Introduction To Machine Learning
Category: NPTEL Assignment Answer