
Introduction to Machine Learning Assignment Answers Week 2 2022 IITKGP

Students, are you looking for hints for the NPTEL Introduction to Machine Learning Week 2 assignment? If so, you are in the right place. This article provides answer hints for the Introduction to Machine Learning Week 2 assignment.

NPTEL Introduction to Machine Learning Assignment Answers Week 2

Q1. In a binary classification problem, out of 30 data points, 12 belong to class I and 18 belong to class II. What is the entropy of the data set?

a. 0.97
b. 0
c. 1
d. 0.67

Answer: a. 0.97 (entropy = −(12/30)·log₂(12/30) − (18/30)·log₂(18/30) ≈ 0.971)
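As a check on the arithmetic, the entropy of a class distribution can be computed directly. A minimal sketch in Python (the `entropy` helper is ours, not part of the course material):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Q1: 12 points in class I, 18 in class II
print(round(entropy([12, 18]), 2))  # -> 0.97
```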



Q2. Decision trees can be used for problems where

a. the attributes are categorical.

b. the attributes are numeric valued.

c. the attributes are discrete valued.

d. In all the above cases.

Answer: d. In all the above cases.


Q3. Which of the following is false?

a. Variance is the error of the trained classifier with respect to the best classifier in the concept class

b. Variance depends on the training set size

c. Variance increases with more training data

d. Variance increases with more complicated classifiers

Answer: c. Variance increases with more training data


Q4. In linear regression, our hypothesis is h_θ(x) = θ0 + θ1·x, and the training data is given in the table below.

X    Y
6    7
5    4
10   9
3    4

If the cost function is J(θ) = (1/2m) · Σᵢ₌₁ᵐ (h_θ(xᵢ) − yᵢ)², where m is the number of training data points, what is the value of J(θ) when θ = (1, 1)?

a. 0

b. 1

c. 2

d. 0.5

Answer: Do it yourself.
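If you want to check your own working, the cost can be evaluated directly. A minimal sketch, assuming the (x, y) pairing shown in the table (the `cost` helper is ours):

```python
# Cost J(θ) = (1/2m) · Σ (h_θ(x_i) − y_i)² with hypothesis h_θ(x) = θ0 + θ1·x
def cost(theta0, theta1, data):
    m = len(data)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in data) / (2 * m)

data = [(6, 7), (5, 4), (10, 9), (3, 4)]  # (x, y) pairs read off the table
print(cost(1, 1, data))  # -> 1.0
```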


Q5. The value of information gain in the following decision tree is:

[Decision tree figure for Q5]

a. 0.380

b. 0.620

c. 0.190

d. 0.477

Answer: a. 0.380


Q6. What is true for Stochastic Gradient Descent?

a. In every iteration, model parameters are updated for multiple training samples

b. In every iteration, model parameters are updated for one training sample

c. In every iteration, model parameters are updated for all training samples

d. None of the above

Answer: b. In every iteration, model parameters are updated for one training sample
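Option b describes the defining feature of SGD: each update uses a single training sample rather than the whole data set. A minimal sketch for the linear-regression hypothesis above (the `sgd_step` helper, learning rate, and sample data are our assumptions for illustration):

```python
import random

def sgd_step(theta0, theta1, x, y, lr=0.01):
    """One SGD update for h(x) = theta0 + theta1*x using a SINGLE sample (x, y)."""
    error = theta0 + theta1 * x - y      # prediction error on this one sample
    return theta0 - lr * error, theta1 - lr * error * x

# Each iteration picks one sample and updates the parameters immediately,
# unlike batch gradient descent, which averages the gradient over all samples.
data = [(6, 7), (5, 4), (10, 9), (3, 4)]
theta0, theta1 = 0.0, 0.0
for _ in range(2000):
    x, y = random.choice(data)
    theta0, theta1 = sgd_step(theta0, theta1, x, y)
```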


Answer Questions 7-8 with the data given below:

ISRO wants to discriminate between Martians (M) and Humans (H) based on the following features: Green ∈ {N, Y}, Legs ∈ {2, 3}, Height ∈ {S, T}, Smelly ∈ {N, Y}. The training data is as follows:

Species  Green  Legs  Height  Smelly
M        N      3     S       Y
M        Y      2     T       N
M        Y      3     T       N
M        N      2     S       Y
M        Y      3     T       N
H        N      2     T       Y
H        N      2     S       N
H        N      2     T       N
H        Y      2     S       N
H        N      2     T       Y

Q7. The entropy of the entire dataset is

a. 0.5

b. 1

c. 0

d. 0.1

Answer: Do it yourself


Q8. Which attribute will be the root of the decision tree?

a. Green

b. Legs

c. Height

d. Smelly

Answer: b. Legs
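The root attribute is the one with the highest information gain. A minimal sketch that computes the gain for each attribute from the table above (the `entropy` and `info_gain` helpers are ours):

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# (Species, Green, Legs, Height, Smelly), transcribed from the table above
rows = [
    ("M", "N", 3, "S", "Y"), ("M", "Y", 2, "T", "N"), ("M", "Y", 3, "T", "N"),
    ("M", "N", 2, "S", "Y"), ("M", "Y", 3, "T", "N"), ("H", "N", 2, "T", "Y"),
    ("H", "N", 2, "S", "N"), ("H", "N", 2, "T", "N"), ("H", "Y", 2, "S", "N"),
    ("H", "N", 2, "T", "Y"),
]

def info_gain(attr):
    """Information gain of splitting the data set on attribute index `attr`."""
    total = len(rows)
    gain = entropy(list(Counter(r[0] for r in rows).values()))  # dataset entropy
    for v in set(r[attr] for r in rows):
        subset = [r[0] for r in rows if r[attr] == v]
        gain -= len(subset) / total * entropy(list(Counter(subset).values()))
    return gain

gains = {name: info_gain(i)
         for i, name in enumerate(["Green", "Legs", "Height", "Smelly"], start=1)}
print(max(gains, key=gains.get))  # -> Legs
```

Splitting on Legs gives a pure branch (all three-legged examples are Martians), which is why it yields the largest gain.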


Q9. In linear regression, the output is:

a. Discrete

b. Continuous and always lies in a finite range

c. Continuous

d. May be discrete or continuous

Answer: c. Continuous


Q10. Identify whether the following statement is true or false.

“Overfitting is more likely when the set of training data is small”

a. True

b. False

Answer: a. True



Disclaimer: These answers are provided for reference only. This website does not guarantee that they are 100% correct, so please complete your assignment yourself.


Also Available:

NPTEL Introduction to Machine Learning Assignment Answers Week 1
