Are you looking for help with the NPTEL Introduction to Machine Learning Week 6 assignment answers? In this article, we have provided hints for the Week 6 assignment answers.
NPTEL Introduction to Machine Learning Assignment Answers Week 6
Q1. In training a neural network, you notice that the loss does not decrease in the first few epochs. What could be the reason for this?
a. The learning rate is low.
b. The regularization parameter is high.
c. Stuck at a local minimum.
d. All of these could be the reason.
Answer: d. All of these could be the reason
Q2. What is the sequence of the following tasks in a perceptron?
i. Initialize the weights of the perceptron randomly.
ii. Go to the next batch of the data set.
iii. If the prediction does not match the output, change the weights.
iv. For a sample input, compute an output.
A. I, II, III, IV
B. IV, III, II, I
C. III, I, II, IV
D. I, IV, III, II
Answer: D. I, IV, III, II
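Below is a minimal Python sketch (not part of the assignment) of a perceptron training loop, with comments mapping each line to the steps in the answer. The function name train_perceptron and the AND-gate example data are illustrative choices of ours, not from the course material.

```python
# A minimal perceptron training sketch; comments map each line to the
# steps in the answer: i -> iv -> iii -> ii.
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])            # i.  initialize the weights randomly
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):               # ii. go to the next sample/batch of the data set
            pred = 1 if xi @ w + b > 0 else 0  # iv. for a sample input, compute an output
            if pred != yi:                     # iii. if the prediction does not match, change the weights
                w += lr * (yi - pred) * xi
                b += lr * (yi - pred)
    return w, b

# Example usage: learn a linearly separable AND gate
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
```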
Q3. Suppose you have inputs x, y, and z with values -2, 5, and -4 respectively. You have a neuron 'q' and a neuron 'f' with functions:
q = x + y
f = q * z
A graphical representation of the functions is given in the original assignment.
What is the gradient of f with respect to x, y, and z?
a. (-3, 4, 4)
b. (4, 4, 3)
c. (-4, -4, 3)
d. (3,-4,-4)
Answer: c. (-4, -4, 3)
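For reference, here is a small Python sketch (our own addition, not from the assignment) of the chain rule behind option c, plugging in x = -2, y = 5, z = -4. Since f = (x + y) * z, the partial derivatives with respect to x and y are both z = -4, and the partial derivative with respect to z is q = 3.

```python
# Chain-rule sketch for q = x + y, f = q * z with x = -2, y = 5, z = -4
x, y, z = -2.0, 5.0, -4.0

q = x + y            # forward pass: q = 3
f = q * z            # forward pass: f = -12

df_dq = z            # d(q*z)/dq = z
df_dz = q            # d(q*z)/dz = q = 3
df_dx = df_dq * 1.0  # dq/dx = 1, so df/dx = z = -4
df_dy = df_dq * 1.0  # dq/dy = 1, so df/dy = z = -4

print(df_dx, df_dy, df_dz)  # -4.0 -4.0 3.0
```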
Q4. A neural network can be considered as multiple simple equations stacked together. Suppose we want to replicate the function for the decision boundary shown in the original assignment.
What will be the final equation?
a. (h1 AND NOT h2) OR (NOT h1 AND h2)
b. (h1 OR NOT h2) AND (NOT h1 OR h2)
c. (h1 AND h2) OR (h1 OR h2)
d. None of these
Answer: a. (h1 AND NOT h2) OR (NOT h1 AND h2)
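As a quick sanity check (our own addition, not part of the original answer), the snippet below evaluates option a over the full truth table; h1 and h2 are plain booleans standing in for the hidden-unit outputs.

```python
# Truth-table check that (h1 AND NOT h2) OR (NOT h1 AND h2) behaves like XOR
for h1 in (False, True):
    for h2 in (False, True):
        out = (h1 and not h2) or (not h1 and h2)
        print(h1, h2, out)
# The output is True exactly when h1 != h2, i.e. an XOR-style decision boundary.
```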
Q5. Which of the following is true about model capacity (where model capacity means the ability of a neural network to approximate complex functions)?
a. As the number of hidden layers increases, model capacity increases
b. As the dropout ratio increases, model capacity increases
c. As the learning rate increases, model capacity increases
d. None of these.
Answer: a. As the number of hidden layers increases, model capacity increases
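As a rough illustration of why option a holds (our own sketch, not from the assignment): each additional hidden layer adds trainable parameters and another nonlinearity. The helper mlp_param_count below is a hypothetical utility just for counting weights and biases in a fully connected network.

```python
# Counting trainable parameters of a fully connected network: more hidden
# layers means more parameters (and more nonlinearities), hence more capacity.
def mlp_param_count(layer_sizes):
    """Total weights + biases for the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))

print(mlp_param_count([10, 32, 1]))      # one hidden layer of width 32
print(mlp_param_count([10, 32, 32, 1]))  # two hidden layers: noticeably more parameters
```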
Q6. First-order gradient descent would not work correctly (i.e., may get stuck) in which of the following graphs? (The graphs are given in the original assignment.)
Answer: Option B
Q7. Which of the following is true?
Single-layer associative neural networks do not have the ability to:
I. Perform pattern recognition
II. Find the parity of a picture
III. Determine whether two or more shapes in a picture are connected or not
a. II and III are true
b. II is true
c. All of the above
d. None of the above
Answer: a. II and III are true
Q8. The network that involves backward links from the outputs to the inputs and hidden layers is called
a. Self-organizing Maps.
b. Perceptron
c. Recurrent Neural Networks.
d. Multi-Layered Perceptron
Answer: c. Recurrent Neural Networks.
Q9. Intersection of linear hyperplanes in a three-layer network can produce both convex and non-convex surfaces. Is this statement true?
a. Yes
b. No
Answer : b. No
Q10. What is meant by the statement “Backpropagation is a generalized delta rule”?
a. Because backpropagation can be extended to hidden layer units
b. Because delta is applied only to the input and output layers, thus making it more generalized.
c. It has no significance
d. None of the above.
Answer: a. Because backpropagation can be extended to hidden layer units
Disclaimer: These answers are provided only to help students as a reference. This website does not claim that the answers are 100% correct, so it urges you to complete your assignments yourself.
Also Available:
NPTEL Introduction to Machine Learning Assignment Answers Week 5
NPTEL Introduction to Machine Learning Assignment Answers Week 7