Are you looking for help with the Machine Learning NPTEL week 5 assignment answers? In this article, we have provided hints for the Machine Learning week 5 assignment answers.
NPTEL Introduction to Machine Learning Assignment Answers Week 5
Q1. What would be the ideal complexity of the curve which can be used for separating the two classes shown in the image below?
d. insufficient data to draw conclusion
Answer: a. Linear
Q2. Which of the following options is true?
A. Linear regression error values have to be normally distributed, but not in the case of logistic regression
B. Logistic regression error values have to be normally distributed, but not in the case of linear regression
C. Both linear and logistic regression error values have to be normally distributed
D. Neither linear nor logistic regression error values need to be normally distributed
Answer: A. Linear regression error values have to be normally distributed, but not in the case of logistic regression
Q3. Which of the following methods do we use to best fit the data in Logistic Regression?
a. Manhattan distance
b. Maximum Likelihood
c. Jaccard distance
d. Both A and B
Answer: b. Maximum Likelihood
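To see why maximum likelihood is the answer: logistic regression is fit by maximising the likelihood of the labels, which is equivalent to minimising the log loss. A minimal sketch, using scikit-learn's `LogisticRegression` (whose solvers perform exactly this maximisation) on hypothetical 1-D toy data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy, made-up data: class 1 for positive x, class 0 for negative x.
X = np.array([[-3.0], [-2.0], [-1.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# The solver maximises the (regularised) log-likelihood of y given X.
clf = LogisticRegression().fit(X, y)
print(clf.predict([[-2.5], [2.5]]))  # cleanly separable -> [0 1]
```

Distance measures such as Manhattan or Jaccard distance play no role in fitting the model; they belong to nearest-neighbour and set-similarity methods.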
Q4. Imagine you are given the below graph of logistic regression, which shows the relationship between the cost function and the number of iterations for 3 different learning rate values (different colors show curves at different learning rates).
Suppose you save the graph for future reference but forget to save the learning rate values used for it. Now you want to find the relation between the learning rate values of these curves. Which of the following is the true relation?
Note: 1. The learning rate for blue is L1.
2. The learning rate for red is L2.
3. The learning rate for green is L3.
d. None of these
Answer: c. L1<L2<L3
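The intuition behind this ranking: for a fixed number of iterations, a larger (but still stable) learning rate drives the cost down faster, so the curve that drops quickest has the largest rate. A hedged sketch on the toy cost f(w) = w² with made-up learning rates:

```python
# Minimise f(w) = w^2 by gradient descent; the gradient is 2w.
def cost_after(lr, iters=20, w0=5.0):
    w = w0
    for _ in range(iters):
        w -= lr * 2 * w  # gradient descent update
    return w * w

# Hypothetical learning rates; all small enough to be stable here.
costs = {lr: cost_after(lr) for lr in (0.01, 0.1, 0.3)}
# The larger the rate, the lower the cost after 20 iterations.
```

Note this only holds while the rate is small enough for the iterations to converge; too large a rate would make the cost oscillate or diverge instead.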
Q5. State whether True or False.
After training an SVM, we can discard all examples which are not support vectors and can still classify new examples.
Answer: a. TRUE
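This is true because the SVM decision function depends only on the support vectors; the remaining training points do not affect the learned boundary. A minimal sketch with scikit-learn's `SVC` on hypothetical toy data, retraining on the support vectors alone and checking the predictions match:

```python
import numpy as np
from sklearn.svm import SVC

# Made-up separable data: two points per class on the x-axis.
X = np.array([[-2.0, 0.0], [-1.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
y = np.array([0, 0, 1, 1])

full = SVC(kernel="linear", C=1.0).fit(X, y)
sv_X = full.support_vectors_        # only the margin points survive
sv_y = y[full.support_]

# Refit on the support vectors alone; the boundary is unchanged.
reduced = SVC(kernel="linear", C=1.0).fit(sv_X, sv_y)
same = np.array_equal(full.predict(X), reduced.predict(X))
```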
Q6. Suppose you are dealing with a 3-class classification problem and you want to train an SVM model on the data, for which you are using the One-vs-all method.
How many times do we need to train our SVM model in such a case?
Answer: c. 3
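One-vs-all trains one binary classifier per class (each class against all the others), so 3 classes means 3 trained models. A sketch using scikit-learn's `OneVsRestClassifier` wrapper around `SVC` on generated toy data:

```python
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Hypothetical 3-class dataset just to count the fitted models.
X, y = make_classification(n_samples=60, n_classes=3,
                           n_informative=4, random_state=0)

ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)
print(len(ovr.estimators_))  # one binary SVM per class -> 3
```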
Q7. What is/are true about kernels in SVM?
1. Kernel functions map low-dimensional data to a high-dimensional space
2. A kernel is a similarity function
c. 1 and 2
d. None of these.
Answer: c. 1 and 2
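Both statements can be seen in the RBF kernel, k(x, z) = exp(−γ‖x − z‖²): it implicitly maps points into a high-dimensional feature space, and its value acts as a similarity score in (0, 1], equal to 1 for identical points and falling toward 0 as points move apart. A minimal sketch:

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """RBF kernel value: a similarity score in (0, 1]."""
    diff = np.asarray(x, dtype=float) - np.asarray(z, dtype=float)
    return float(np.exp(-gamma * np.sum(diff ** 2)))

print(rbf([0, 0], [0, 0]))  # identical points -> 1.0
print(rbf([0, 0], [3, 4]))  # distant points -> nearly 0
```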
Q8. Suppose you are using RBF kernel in SVM with high Gamma value. What does this signify?
a. The model would consider even far-away points from the hyperplane for modelling.
b. The model would consider only the points close to the hyperplane for modelling.
c. The model would not be affected by the distance of points from the hyperplane for modelling.
d. None of the above
Answer: b. The model would consider only the points close to the hyperplane for modelling.
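The reason: in the RBF kernel exp(−γ·d²), a high γ makes the similarity decay very quickly with squared distance d², so only nearby points retain any influence. A small numeric sketch with made-up γ values:

```python
import numpy as np

d2 = 1.0  # squared distance between two points (hypothetical)

# With low gamma the similarity stays high; with high gamma it
# collapses to near zero, so only very close points matter.
low_gamma_sim = np.exp(-0.1 * d2)
high_gamma_sim = np.exp(-10.0 * d2)
print(low_gamma_sim, high_gamma_sim)
```

This is also why very high γ values tend to overfit: each training point influences only a tiny neighbourhood around itself.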
Q9. Below are the labelled instances of 2 classes and hand-drawn decision boundaries for logistic regression. Which of the following figures demonstrates overfitting of the training data?
d. None of these
Answer : c. C
Q10. What do you conclude after seeing the visualization in previous question?
C1. The training error in the first plot is higher as compared to the second and third plots.
C2. The best model for this regression problem is the last (third) plot because it has minimum training error (zero).
C3. Out of the 3 models, the second model is expected to perform best on unseen data.
C4. All will perform similarly because we have not seen the test data.
a. C1 and C2
b. C1 and C3
c. C2 and C3
Answer: b. C1 and C3
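The key point behind rejecting C2 is that zero training error does not mean the best model: an overfit model can memorise the training set perfectly and still generalise poorly. A hedged sketch using 1-nearest-neighbour (which, like the third plot's wiggly boundary, achieves zero training error by construction) on made-up noisy data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical noisy 2-D data: the true boundary is x0 + x1 = 0,
# but the labels carry noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=100) > 0).astype(int)

# k=1 memorises every training point -> zero training error,
# yet it also memorises the noise, so it overfits.
overfit = KNeighborsClassifier(n_neighbors=1).fit(X, y)
train_acc = overfit.score(X, y)
print(train_acc)  # 1.0 on the training set
```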
Disclaimer: These answers are provided only for reference, to help students. This website does not claim that the answers are 100% correct, so we urge you to complete your assignment yourself.