
NPTEL Introduction to Machine Learning Assignment Answers Week 7 2022 IITKGP


Are you looking for help with the NPTEL Introduction to Machine Learning Week 7 assignment? In this article, we have provided hints for the Week 7 assignment answers.

NPTEL Introduction to Machine Learning Assignment Answers Week 7

Q1. Which of the following options is/are correct regarding the benefits of an ensemble model?

1. Better performance

2. More generalized model

3. Better interpretability

a. 1 and 3
b. 2 and 3
c. 1 and 2
d. 1, 2 and 3

Answer: c. 1 and 2
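
For Q1, the usual trade-off is that an ensemble improves accuracy and generalization at the cost of interpretability. A minimal sketch (assuming scikit-learn is available; the dataset and settings are only illustrative) comparing a single decision tree with a random forest:

```python
# Illustrative comparison: single decision tree vs. an ensemble of trees.
# Assumes scikit-learn is installed; dataset and hyperparameters are arbitrary.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

single_tree = DecisionTreeClassifier(random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0)

print("Single tree CV accuracy:", cross_val_score(single_tree, X, y, cv=5).mean())
print("Random forest CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
# The ensemble typically scores higher (better performance, more generalized model),
# but the combined model is harder to interpret than one tree, which is why
# option 3 (better interpretability) is excluded.
```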



Q2. In AdaBoost, we give more weight to points that were misclassified in previous iterations. Suppose we introduce a limit or cap on the weight that any point can take (for example, a restriction that prevents any point's weight from exceeding a value of 10). Which among the following would be an effect of such a modification?

A. We may observe the performance of the classifier reduce as the number of stages increases.

B. It makes the final classifier robust to outliers.

C. It may result in lower overall performance.

D. None of these.

Answer: Option B and C.
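
For Q2, here is a toy sketch (not NPTEL's reference code; the cap value comes from the question, the data are made up) of an AdaBoost-style sample-weight update with a cap applied:

```python
import numpy as np

# Toy AdaBoost-style weight update with a hypothetical cap (the value 10 is from the question).
def update_weights(weights, misclassified, alpha, cap=10.0):
    # Standard step: boost the weights of misclassified points.
    new_w = weights * np.exp(alpha * misclassified.astype(float))
    # Modification from the question: no single point's weight may exceed the cap.
    new_w = np.minimum(new_w, cap)
    # Renormalize so the weights again form a distribution.
    return new_w / new_w.sum()

w = np.full(5, 0.2)                                  # five points, uniform weights
miss = np.array([True, False, False, True, False])   # points 0 and 3 were misclassified
print(update_weights(w, miss, alpha=4.0))            # the cap triggers for the misclassified points
```

With the cap in place, outlier-like points cannot accumulate unbounded weight, so the final classifier is more robust to outliers (option B); on the other hand, later stages may pay too little attention to genuinely hard points, which can lower overall performance (option C).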


Q3. Which among the following are some of the differences between bagging and boosting?

a. In bagging, we use the same classification algorithm for training on each sample of the data, whereas in boosting, we use different classification algorithms on the different training data samples.

b. Bagging is easy to parallelize whereas boosting is inherently a sequential process.

c. In bagging we typically use sampling with replacement whereas in boosting, we typically use weighted sampling techniques.

d. In comparison with the performance of a base classifier on a particular dataset, bagging will generally not increase the error, whereas boosting may lead to an increase in the error.

Answer: Option B, C and D.
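
For Q3, the parallel/sequential contrast in option b is visible in a library such as scikit-learn (a sketch under the assumption that scikit-learn is available; the parameter choices are illustrative): bagging can fit its base estimators in parallel, while boosting fits them one after another because each stage depends on the previous stage's mistakes.

```python
# Bagging vs. boosting in scikit-learn (illustrative settings).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Bagging: bootstrap samples (sampling with replacement, option c), base trees trained
# independently, so the fits can run in parallel across cores (n_jobs=-1).
bagging = BaggingClassifier(n_estimators=50, bootstrap=True, n_jobs=-1, random_state=0).fit(X, y)

# Boosting: each stage reweights the training points based on the previous stage's
# errors (weighted sampling, option c), so the stages must be fit sequentially.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
```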


Q4. What is the VC dimension of the class of spheres in 3-dimensional space?

a. 3

b. 4

c. 5

d. 6

Answer: a. 3


Q5. Considering the AdaBoost algorithm, which among the following statements is true?

a. In each stage, we try to train a classifier which makes accurate predictions on any subset of the data points where the subset size is at least half the size of the data set.

b. In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.

c. The weight assigned to an individual classifier depends upon the number of data points correctly classified by the classifier.

d. The weight assigned to an individual classifier depends upon the weighted sum of errors of the misclassified points for that classifier.

Answer: Option B and D.
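
For Q5, the weight assigned to the m-th classifier in AdaBoost depends on its weighted misclassification error (a standard statement of the formula, not quoted from the course slides):

```latex
\alpha_m = \frac{1}{2}\ln\!\left(\frac{1-\epsilon_m}{\epsilon_m}\right),
\qquad
\epsilon_m = \frac{\sum_{i=1}^{N} w_i \,\mathbb{1}\left[h_m(x_i) \neq y_i\right]}{\sum_{i=1}^{N} w_i}
```

A classifier with a smaller weighted error receives a larger weight, which matches option d; and since each stage is trained on a reweighted sample that emphasizes previously misclassified points, option b also holds.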


Q6. Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?

a. At least one set of 6 points can be shattered by the hypothesis space.

b. Two sets of 6 points can be shattered by the hypothesis space.

c. All sets of 6 points can be shattered by the hypothesis space.

d. No set of 7 points can be shattered by the hypothesis space.

Answer: Option A and D.
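
For Q6, recall the definition of VC dimension (the standard definition, not quoted from the course): the VC dimension of a hypothesis space H is the size of the largest set of points that H can shatter,

```latex
\mathrm{VC}(H) = \max\{\, n : \exists\, S,\ |S| = n,\ H \text{ shatters } S \,\}
```

So VC(H) = 6 means at least one set of 6 points can be shattered (option a) and no set of 7 points can be shattered (option d); it does not require that every set of 6 points be shattered, which rules out option c.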


Q7. Ensembles will yield bad results when there is significant diversity among the models. Write True or False.

a. True

b. False

Answer: b. False


Q8. Which of the following algorithms is not an ensemble learning algorithm?

a. Random Forest

b. Adaboost

c. Gradient Boosting

d. Decision Trees

Answer: d. Decision Trees


Q9. Which of the following can be true for selecting base learners for an ensemble?

a. Different learners can come from the same algorithm with different hyperparameters.

b. Different learners can come from different algorithms.

c. Different learners can come from different training spaces

d. All of the above.

Answer: d. All of the above.
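
For Q9, all three ways of obtaining diverse base learners can be combined, for example in a voting ensemble (a sketch assuming scikit-learn; the particular models and hyperparameters are arbitrary):

```python
# Base learners from the same algorithm with different hyperparameters,
# from different algorithms, and from different (bootstrap) training spaces.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

ensemble = VotingClassifier(estimators=[
    ("tree_shallow", DecisionTreeClassifier(max_depth=2)),           # same algorithm...
    ("tree_deep", DecisionTreeClassifier(max_depth=8)),              # ...different hyperparameters
    ("logreg", LogisticRegression(max_iter=1000)),                   # a different algorithm
    ("bagged", BaggingClassifier(n_estimators=10, random_state=0)),  # different training subsets
]).fit(X, y)

print(ensemble.predict(X[:5]))
```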


Q10. Generally, an ensemble method works better if the individual base models have _____________?

Note: Individual models have accuracy greater than 50%

a. Less correlation among predictions

b. High correlation among predictions

c. Correlation does not have an impact on the ensemble output

d. None of the above.

Answer: a. Less correlation among predictions
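
For Q10, one way to see why low correlation among predictions helps: if each base model is better than chance and their errors are not strongly correlated, a majority vote tends to cancel out individual mistakes. A minimal sketch (made-up predictions, assuming only NumPy) that measures the correlation between base models' outputs:

```python
import numpy as np

# Hypothetical 0/1 predictions from three base models on the same ten examples.
pred_a = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
pred_b = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 0])   # disagrees with A on some points
pred_c = np.array([0, 0, 1, 1, 1, 1, 0, 0, 1, 0])   # disagrees on different points

print("corr(A, B):", np.corrcoef(pred_a, pred_b)[0, 1])
print("corr(A, C):", np.corrcoef(pred_a, pred_c)[0, 1])

# Majority vote over the three models: when predictions are not highly correlated,
# a mistake made by one model on a given example tends to be outvoted by the others.
majority = (pred_a + pred_b + pred_c >= 2).astype(int)
print("Majority vote:", majority)
```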



Disclaimer: These answers are provided only for reference purposes to help students. This website does not guarantee that the answers are 100% correct, so you are urged to complete your assignment yourself.


Also Available:

NPTEL Introduction to Machine Learning Assignment Answers Week 6

NPTEL Introduction to Machine Learning Assignment Answers Week 8
