NPTEL Introduction To Machine Learning – IITKGP Assignment 6 Answers 2022 [July-Dec]:- In this post, we have provided answers to NPTEL Introduction to Machine Learning – IITKGP Assignment 6 (Week 6). These answers are provided for reference only. Please do your assignment using your own knowledge.
About Introduction To Machine Learning – IITKGP
This course provides a concise introduction to the fundamental concepts of machine learning and popular machine learning algorithms. We will cover the standard and most popular supervised learning algorithms including linear regression, logistic regression, decision trees, k-nearest neighbour, an introduction to Bayesian learning and the naïve Bayes algorithm, support vector machines and kernels and neural networks with an introduction to Deep Learning. We will also cover the basic clustering algorithms. Feature reduction methods will also be discussed.
We will introduce the basics of computational learning theory. In the course, we will discuss various issues related to the application of machine learning algorithms. We will discuss hypothesis space, overfitting, bias and variance, tradeoffs between representational power and learnability, evaluation strategies and cross-validation. The course will be accompanied by hands-on problem-solving with programming in Python and some tutorial sessions.
CRITERIA TO GET A CERTIFICATE
Average assignment score = 25% of the average of the best 6 assignments out of the total 8 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100
Final score = Average assignment score + Exam score
YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF YOUR AVERAGE ASSIGNMENT SCORE >=10/25 AND YOUR EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
Introduction To Machine Learning – IITKGP Assignment 6 Answers NPTEL 2022 [July-Dec]
1. In training a neural network, we notice that the loss does not decrease in the first few epochs. What is the reason for this?
A) The learning Rate is low.
B) Regularization Parameter is High.
C) Stuck at the Local Minima.
D) All of these could be the reason.
2. What is the sequence of the following tasks in a perceptron?
I) Initialize the weights of the perceptron randomly.
II) Go to the next batch of the dataset.
III) If the prediction does not match the output, change the weights.
IV) For a sample input, compute an output.
A) I, II, III, IV
B) IV, III, II, I
C) III, I, II, IV
D) I, IV, III, II
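For reference, the classic perceptron training loop initializes the weights first (I), then for each sample computes an output (IV), updates the weights on a mismatch (III), and moves on to the next batch (II). A minimal sketch of that loop, using a made-up toy dataset (the AND function) and an assumed learning rate:

```python
import numpy as np

# Step I: initialize the weights of the perceptron randomly.
rng = np.random.default_rng(0)
w = rng.normal(size=2)
b = 0.0
lr = 0.1  # assumed learning rate for this sketch

# Hypothetical toy data: the AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

for epoch in range(20):
    for xi, target in zip(X, y):             # Step II: next sample/batch
        pred = int(np.dot(w, xi) + b > 0)    # Step IV: compute an output
        if pred != target:                   # Step III: on mismatch, adjust weights
            w += lr * (target - pred) * xi
            b += lr * (target - pred)

print([int(np.dot(w, xi) + b > 0) for xi in X])  # learned AND outputs
```

Since AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero training error.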
3. Suppose you have inputs as x, y, and z with values -2, 5, and -4 respectively. You have a neuron ‘q’ and neuron ‘f’ with functions:
q = x + y
f = q * z
Graphical representation of the functions is as follows:
What is the gradient of F with respect to x, y, and z?
A) (-3, 4, 4)
B) (4, 4, 3)
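The gradient can be checked directly with the chain rule: with q = x + y and f = q·z, we get ∂f/∂x = z, ∂f/∂y = z, and ∂f/∂z = q. A quick sketch in plain Python:

```python
# Forward pass through the computation graph.
x, y, z = -2, 5, -4
q = x + y        # q = 3
f = q * z        # f = -12

# Backward pass (chain rule).
df_dz = q            # ∂f/∂z = q = 3
df_dq = z            # ∂f/∂q = z = -4
df_dx = df_dq * 1    # ∂q/∂x = 1, so ∂f/∂x = -4
df_dy = df_dq * 1    # ∂q/∂y = 1, so ∂f/∂y = -4

print((df_dx, df_dy, df_dz))  # (-4, -4, 3)
```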
4. A neural network can be considered as multiple simple equations stacked together. Suppose we want to replicate the function for the below-mentioned decision boundary.
What will be the final equation?
A) (h1 AND NOT h2) OR (NOT h1 AND h2)
B) (h1 OR NOT h2) AND (NOT h1 OR h2)
C) (h1 AND h2) OR (h1 OR h2)
D) None of these
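Expressions like these can be verified against a truth table. For instance, the expression in option A, (h1 AND NOT h2) OR (NOT h1 AND h2), is exactly XOR, the classic decision boundary that a single neuron cannot represent. A quick check:

```python
from itertools import product

# Option A's expression, written as a Python predicate.
def expr_a(h1, h2):
    return (h1 and not h2) or (not h1 and h2)

# Compare against XOR (inequality of the two inputs) on all four cases.
for h1, h2 in product([False, True], repeat=2):
    print(h1, h2, expr_a(h1, h2))

assert all(expr_a(a, b) == (a != b) for a, b in product([False, True], repeat=2))
```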
5. Which of the following is true about model capacity (where model capacity means the ability of neural network to approximate complex functions)?
A) As the number of hidden layers increases, model capacity increases
B) As the dropout ratio increases, model capacity increases
C) As the learning rate increases, model capacity increases
D) None of these.
6. First-order gradient descent would not work correctly (i.e., may get stuck) in which of the following graphs?
7. Which of the following is true? Single-layer associative neural networks do not have the ability to
I) Perform pattern recognition
II) Find the parity of a picture
III) Determine whether two or more shapes in a picture are connected or not
A) II and III are true
B) II is true
C) All of the above
D) None of the above
8. The network that involves backward links from the outputs to the inputs and hidden layers is called
A) Self-organizing Maps
C) Recurrent Neural Networks
D) Multi-Layered Perceptron
9. The intersection of linear hyperplanes in a three-layer network can produce both convex and non-convex surfaces. Is this statement true?
10. What is meant by the statement “Backpropagation is a generalized delta rule”?
A) Because backpropagation can be extended to hidden layer units
B) Because delta is applied only to the input and output layers, thus making it more generalized.
C) It has no significance
D) None of the above.
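For intuition: the original delta rule computes an error term only at the output units, while backpropagation generalizes it by propagating that delta backward through the weights to the hidden units. A minimal numeric sketch, using a hypothetical one-input, one-hidden-unit, one-output network with made-up weights and sigmoid activations:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Hypothetical tiny network: 1 input -> 1 hidden unit -> 1 output.
x, target = 1.0, 1.0
w_hidden, w_out = 0.5, 0.8  # assumed weights for illustration

# Forward pass.
h = sigmoid(w_hidden * x)
o = sigmoid(w_out * h)

# Output delta: the classic delta rule, (target - output) * f'(net).
delta_out = (target - o) * o * (1 - o)

# Hidden delta: the "generalized" step, backpropagating delta_out
# through the outgoing weight and the hidden unit's derivative.
delta_hidden = delta_out * w_out * h * (1 - h)

print(delta_out, delta_hidden)
```

The hidden delta exists only because the output delta is pushed backward, which is why backpropagation extends the delta rule to hidden-layer units.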