Students, are you looking for hints for the NPTEL Introduction to Machine Learning Week 2 assignment? If so, you are in the right place. In this article, you will find hints for the answers to the Introduction to Machine Learning assignment.

## NPTEL Introduction to Machine Learning Assignment Answers Week 2

**Q1. In a binary classification problem, out of 30 data points 12 belong to class I and 18 belong to class II. What is the entropy of the data set?**

a. 0.97

b. 0

c. 1

d. 0.67

**Answer:** a. 0.97

(The entropy is −(12/30)·log₂(12/30) − (18/30)·log₂(18/30) ≈ 0.971; it would equal 1 only for a perfect 50/50 split.)
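You can verify this yourself with a short sketch; the `entropy` helper below is illustrative, not part of the course material:

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Q1: 12 points in class I, 18 in class II
print(round(entropy([12, 18]), 2))  # → 0.97
```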


**Q2. Decision trees can be used for the problems where**

a. the attributes are categorical.

b. the attributes are numeric valued.

c. the attributes are discrete valued.

d. In all the above cases.

**Answer**: d. In all the above cases.

**Q3. Which of the following is false?**

a. Variance is the error of the trained classifier with respect to the best classifier in the concept class

b. Variance depends on the training set size

c. Variance increases with more training data

d. Variance increases with more complicated classifiers

**Answer:** c. Variance increases with more training data

**Q4. In linear regression, our hypothesis is h_θ(x) = θ₀ + θ₁x, and the training data is given in the table below.**

| X | Y |
|---|---|
| 6 | 7 |
| 5 | 4 |
| 10 | 9 |
| 3 | 4 |

**If the cost function is J(θ) = (1/2m) Σᵢ₌₁ᵐ (h_θ(xᵢ) − yᵢ)², where m is the number of training data points, what is the value of J(θ) when θ = (1, 1)?**

a. 0

b. 1

c. 2

d. 0.5

**Answer:** Do it yourself.
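A quick sketch to check the arithmetic yourself, plugging the table's data into the cost function with θ = (1, 1):

```python
# Hypothesis h(x) = θ0 + θ1*x with θ = (1, 1), data from the Q4 table.
X = [6, 5, 10, 3]
Y = [7, 4, 9, 4]
theta0, theta1 = 1, 1
m = len(X)

# J(θ) = (1/2m) * Σ (h(x_i) - y_i)²
# Predictions are 7, 6, 11, 4; squared errors are 0, 4, 4, 0.
J = sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(X, Y)) / (2 * m)
print(J)  # → 1.0
```

The squared errors sum to 8, and 8 / (2·4) = 1, which matches option b.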

**Q5. The value of information gain in the following decision tree is:**

a. 0.380

b. 0.620

c. 0.190

d. 0.477

**Answer:** a. 0.380

**Q6. What is true for Stochastic Gradient Descent?**

a. In every iteration, model parameters are updated for multiple training samples

b. In every iteration, model parameters are updated for one training sample

c. In every iteration, model parameters are updated for all training samples

d. None of the above

**Answer:** b. In every iteration, model parameters are updated for one training sample
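To see what "one sample per update" means in practice, here is a minimal SGD sketch for the linear model from Q4; the learning rate, iteration count, and variable names are illustrative choices, not from the course:

```python
import random

# Stochastic gradient descent for h(x) = θ0 + θ1*x:
# each iteration updates θ using ONE randomly chosen training sample.
X = [6, 5, 10, 3]
Y = [7, 4, 9, 4]
theta0, theta1, lr = 0.0, 0.0, 0.005

random.seed(0)
for _ in range(2000):
    i = random.randrange(len(X))         # pick a single training sample
    err = theta0 + theta1 * X[i] - Y[i]  # prediction error on that sample only
    theta0 -= lr * err                   # gradient step w.r.t. θ0
    theta1 -= lr * err * X[i]            # gradient step w.r.t. θ1

print(round(theta0, 2), round(theta1, 2))
```

Contrast this with batch gradient descent (option c), which would compute the gradient over all m samples before each parameter update.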

Answer Questions 7-8 with the data given below:

ISRO wants to discriminate between Martians (M) and Humans (H) based on the following features: Green ∈ {N, Y}, Legs ∈ {2, 3}, Height ∈ {S, T}, Smelly ∈ {N, Y}. The training data is as follows:

| Species | Green | Legs | Height | Smelly |
|---|---|---|---|---|
| M | N | 3 | S | Y |
| M | Y | 2 | T | N |
| M | Y | 3 | T | N |
| M | N | 2 | S | Y |
| M | Y | 3 | T | N |
| H | N | 2 | T | Y |
| H | N | 2 | S | N |
| H | N | 2 | T | N |
| H | Y | 2 | S | N |
| H | N | 2 | T | Y |

**Q7. The entropy of the entire dataset is**

a. 0.5

b. 1

c. 0

d. 0.1

**Answer:** Do it yourself
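The dataset has 5 Martians and 5 Humans, so you can check the entropy with a short sketch (the `entropy` helper is illustrative):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Q7: 5 Martians and 5 Humans — an even split
print(entropy([5, 5]))  # → 1.0
```

An even two-class split always gives entropy 1 bit, the maximum for binary classification.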

**Q8. Which attribute will be the root of the decision tree?**

a. Green

b. Legs

c. Height

d. Smelly

**Answer:** b. Legs
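The root is the attribute with the highest information gain over the full dataset. A sketch of that computation, using the training table above (the `entropy` and `info_gain` helpers are illustrative names, not from the course):

```python
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    total = len(labels)
    return -sum((labels.count(c) / total) * math.log2(labels.count(c) / total)
                for c in set(labels))

# Training data from the table above: (Species, Green, Legs, Height, Smelly)
rows = [
    ("M", "N", 3, "S", "Y"), ("M", "Y", 2, "T", "N"), ("M", "Y", 3, "T", "N"),
    ("M", "N", 2, "S", "Y"), ("M", "Y", 3, "T", "N"), ("H", "N", 2, "T", "Y"),
    ("H", "N", 2, "S", "N"), ("H", "N", 2, "T", "N"), ("H", "Y", 2, "S", "N"),
    ("H", "N", 2, "T", "Y"),
]
species = [r[0] for r in rows]

def info_gain(attr_index):
    """Information gain of splitting the dataset on one attribute."""
    gain = entropy(species)
    for v in set(r[attr_index] for r in rows):
        subset = [r[0] for r in rows if r[attr_index] == v]
        gain -= (len(subset) / len(rows)) * entropy(subset)
    return gain

names = {1: "Green", 2: "Legs", 3: "Height", 4: "Smelly"}
gains = {names[i]: round(info_gain(i), 3) for i in names}
print(gains)                      # Legs has the highest gain
print(max(gains, key=gains.get))  # → Legs
```

Splitting on Legs sends all three 3-legged rows (all Martians) into a pure branch, so its gain (≈ 0.396) beats Green (≈ 0.125), while Height and Smelly give no gain at all.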

**Q9. In Linear Regression the output is:**

a. Discrete

b. Continuous and always lies in a finite range

c. Continuous

d. May be discrete or continuous

**Answer:** c. Continuous

**Q10. Identify whether the following statement is true or false.**

**“Overfitting is more likely when the set of training data is small”**

a. True

b. False

**Answer:** a. True


**Disclaimer:** These answers are provided only to help students as a reference. This website does not guarantee that they are 100% correct, so you are urged to complete the assignment yourself.

**Also Available:**

NPTEL Introduction to Machine Learning Assignment Answers Week 1