
# Introduction to Machine Learning Assignment Answers Week 2 2022 IITKGP

Students, do you want hints for the NPTEL Introduction to Machine Learning Week 2 assignment? If so, you are in the right place. This article provides hints for the answers to the Introduction to Machine Learning Week 2 assignment questions.

## NPTEL Introduction to Machine Learning Assignment Answers Week 2

Q1. In a binary classification problem, out of 30 data points 12 belong to class I and 18 belong to class II. What is the entropy of the data set?

a. 0.97
b. 0
c. 1
d. 0.67
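
For a two-class split like this, the entropy is H = −p₁ log₂ p₁ − p₂ log₂ p₂ with p₁ = 12/30 and p₂ = 18/30. A quick sketch of that calculation:

```python
from math import log2

# Entropy of the dataset in Q1: 12 of 30 points in class I, 18 in class II
p1, p2 = 12 / 30, 18 / 30
entropy = -(p1 * log2(p1) + p2 * log2(p2))
print(round(entropy, 2))  # 0.97
```

This matches option (a).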

Q2. Decision trees can be used for the problems where

a. the attributes are categorical.

b. the attributes are numeric valued.

c. the attributes are discrete valued.

d. In all the above cases.

Answer: d. In all the above cases.

Q3. Which of the following is false?

a. Variance is the error of the trained classifier with respect to the best classifier in the concept class

b. Variance depends on the training set size

c. Variance increases with more training data

d. Variance increases with more complicated classifiers

Answer: c. Variance increases with more training data

Q4. In linear regression, our hypothesis is hθ(x) = θ0 + θ1 x, the training data is given in the table.

If the cost function is J(θ) = (1/2m) Σᵢ₌₁ᵐ (hθ(xᵢ) − yᵢ)², where m is the number of training data points, what is the value of J(θ) when θ = (1, 1)?

a. 0

b. 1

c. 2

d. 0.5
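
The assignment's data table is not reproduced above, so here is a minimal sketch of evaluating J(θ) = (1/2m) Σ (hθ(xᵢ) − yᵢ)² on hypothetical data points (any points lying exactly on y = θ₀ + θ₁x give a cost of zero):

```python
# Cost J(theta) for the hypothesis h(x) = theta0 + theta1 * x.
def cost(theta0, theta1, data):
    m = len(data)  # number of training points
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in data) / (2 * m)

# Hypothetical example: points on the line y = 1 + x give zero cost at theta = (1, 1)
data = [(0, 1), (1, 2), (2, 3)]
print(cost(1, 1, data))  # 0.0
```

Substitute the table from the question in place of the hypothetical data to get the actual answer.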

Q5. The value of information gain in the following decision tree is:

a. 0.380

b. 0.620

c. 0.190

d. 0.477
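
The decision tree figure is not reproduced above, but information gain is always computed the same way: parent entropy minus the size-weighted entropy of the children. A sketch with hypothetical class counts (the counts below are illustrative only, not the ones from the question's figure):

```python
from math import log2

def entropy(pos, neg):
    """Binary entropy of a node with pos/neg class counts."""
    total = pos + neg
    result = 0.0
    for count in (pos, neg):
        if count:
            p = count / total
            result -= p * log2(p)
    return result

# Hypothetical split: parent (9+, 5-) splits into children (6+, 2-) and (3+, 3-)
parent = entropy(9, 5)
gain = parent - (8 / 14) * entropy(6, 2) - (6 / 14) * entropy(3, 3)
print(round(gain, 3))
```

Plug in the counts from the tree in the question to obtain the gain asked for.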

Q6. What is true for Stochastic Gradient Descent?

a. In every iteration, model parameters are updated for multiple training samples

b. In every iteration, model parameters are updated for one training sample

c. In every iteration, model parameters are updated for all training samples

d. None of the above

Answer: b. In every iteration, model parameters are updated for one training sample
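
The defining feature of SGD is that each parameter update uses a single training sample, unlike batch gradient descent, which uses all samples per update. A minimal sketch for linear regression (learning rate and data are illustrative):

```python
import random

def sgd_step(theta0, theta1, x, y, lr=0.01):
    # Update parameters using the gradient from ONE sample (x, y)
    err = (theta0 + theta1 * x) - y
    return theta0 - lr * err, theta1 - lr * err * x

random.seed(0)
data = [(x, 2 * x + 1) for x in range(5)]  # samples from y = 2x + 1
theta0 = theta1 = 0.0
for _ in range(2000):
    x, y = random.choice(data)  # pick one sample per iteration
    theta0, theta1 = sgd_step(theta0, theta1, x, y)
print(theta0, theta1)  # approaches (1, 2)
```

Contrast this with option (c), batch gradient descent, where the whole dataset contributes to every update.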

Answer Questions 7-8 with the data given below:

ISRO wants to discriminate between Martians (M) and Humans (H) based on the following features: Green ∈ {N, Y}, Legs ∈ {2, 3}, Height ∈ {S, T}, Smelly ∈ {N, Y}. The training data is as follows:

Q7. The entropy of the entire dataset is

a. 0.5

b. 1

c. 0

d. 0.1

Q8. Which attribute will be the root of the decision tree?

a. Green

b. Legs

c. Height

d. Smelly

Q9. In Linear Regression the output is:

a. Discrete

b. Continuous and always lies in a finite range

c. Continuous

d. May be discrete or continuous

Q10. Identify whether the following statement is true or false.

“Overfitting is more likely when the set of training data is small”

a. True

b. False

Disclaimer: These answers are provided only as a reference to help students. This website does not guarantee that they are 100% correct, so you are urged to complete the assignment yourself.

Also Available:

NPTEL Introduction to Machine Learning Assignment Answers Week 1
