# NPTEL Deep Learning – IIT Ropar Assignment 4 Answers 2022

NPTEL Deep Learning – IIT Ropar Assignment 4 Answers 2022:- In this post, we have provided the answers for Deep Learning – IIT Ropar Assignment 4. The answers are given here only for reference. Please do your assignment using your own knowledge.

## About Deep Learning IIT – Ropar

Deep Learning has received a lot of attention over the past few years and has been employed successfully by companies like Google, Microsoft, IBM, Facebook, Twitter, etc. to solve a wide range of problems in Computer Vision and Natural Language Processing. In this course, we will learn about the building blocks used in these Deep Learning based solutions. Specifically, we will learn about feedforward neural networks, convolutional neural networks, recurrent neural networks and attention mechanisms. We will also look at various optimization algorithms such as Gradient Descent, Nesterov Accelerated Gradient Descent, Adam, AdaGrad and RMSProp which are used for training such deep neural networks. At the end of this course, students would have knowledge of deep architectures used for solving various Vision and NLP tasks.

CRITERIA TO GET A CERTIFICATE

Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
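As a rough illustration of how this weighting works (the scores below are made-up numbers, not from NPTEL), the final score could be computed as follows:

```python
# Hypothetical scores, used only to illustrate the weighting described above.
assignment_avg = 80  # average of best 8 assignment scores, out of 100
exam_score = 60      # proctored exam score, out of 100

assignment_component = 0.25 * assignment_avg  # contributes up to 25
exam_component = 0.75 * exam_score            # contributes up to 75
final_score = assignment_component + exam_component

eligible = assignment_component >= 10 and exam_component >= 30
print(final_score, eligible)  # 65.0 True
```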

## NPTEL Deep Learning – IIT Ropar Assignment 4 Answers 2022

1. Consider the movement on the 3D error surface for the Vanilla Gradient Descent Algorithm. Select all the options that are TRUE.

a. Smaller the gradient, slower the movement
b. Larger the gradient, faster the movement
c. Gentle the slope, smaller the gradient
d. Steeper the slope, smaller the gradient

`Answer:- a, b, c`
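For reference, a minimal sketch of the vanilla gradient descent update (variable names are illustrative) shows why the movement is tied directly to the gradient: on gentle slopes the gradient is small, so the steps are small.

```python
def vanilla_gradient_descent(grad_fn, w, eta=0.1, steps=100):
    """Plain gradient descent: each step is proportional to the gradient,
    so a small gradient (gentle slope) means slow movement and a large
    gradient (steep slope) means fast movement."""
    for _ in range(steps):
        w = w - eta * grad_fn(w)
    return w

# Example: minimise f(w) = w**2, whose gradient is 2*w.
print(vanilla_gradient_descent(lambda w: 2 * w, w=5.0))  # converges towards 0
```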

2. Pick out the drawback of the Vanilla gradient descent algorithm.

a. Very slow movement on gentle slopes
b. Increased oscillations before converging
c. Escapes minima because of long strides
d. Very slow movement on steep slopes

`Answer:- a`

3. Comment on the update at the t-th step in Momentum-based Gradient Descent.

b. Polynomial weighted average
c. Exponential weighted average of gradient
d. Average of recent three gradients

`Answer:- c`
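A sketch of the momentum update (using the usual notation, with beta as the momentum coefficient) makes the exponential weighting explicit: unrolling the recursion shows that older gradients enter the update with geometrically decaying weights.

```python
def momentum_update(w, u_prev, grad, eta=0.1, beta=0.9):
    """Momentum-based gradient descent update at step t.
    Unrolling u gives u_t = grad_t + beta*grad_{t-1} + beta^2*grad_{t-2} + ...,
    i.e. an exponentially weighted average of past gradients."""
    u = beta * u_prev + grad
    w = w - eta * u
    return w, u
```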

4. Given a horizontal slice of the error surface as shown in the figure below, if the error at the position p is 0.49 then what is the error at point q?

a. 0.70
b. 0.69
c. 0.49
d. 0

`Answer:- c`
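Reasoning: a horizontal slice of the error surface is a contour of constant error, so every point on it, including q, has the same error as p, i.e. 0.49.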

5. Identify the update rule for Nesterov Accelerated Gradient Descent.

`Answer:- c`
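For reference, one common way of writing the Nesterov Accelerated Gradient update (a sketch of the usual look-ahead formulation; the exact notation in the assignment options may differ) is:

```python
def nag_update(w, u_prev, grad_fn, eta=0.1, beta=0.9):
    """Nesterov Accelerated Gradient: evaluate the gradient at the
    look-ahead point w - beta*u_prev, then do a momentum-style update."""
    w_lookahead = w - beta * u_prev
    u = beta * u_prev + grad_fn(w_lookahead)
    w = w - eta * u
    return w, u
```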

6. Select all the options that are TRUE for Line search.

a. w is updated using different learning rates
b. updated value of w always gives the minimum loss
c. Involves minimum calculation
d. Best value of Learning rate is used at every step

`Answer:- a, b, d`
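A minimal sketch of one line-search step (the candidate learning rates below are arbitrary): the parameter is tentatively updated with several learning rates and the candidate giving the lowest loss is kept, so the best learning rate is used at every step at the cost of extra loss evaluations.

```python
def line_search_step(w, grad, loss_fn, etas=(0.001, 0.01, 0.1, 1.0)):
    """Try several learning rates and keep the update with the smallest loss."""
    candidates = [w - eta * grad for eta in etas]
    return min(candidates, key=loss_fn)

# Example: one step on f(w) = w**2 starting from w = 5 (gradient 2*w = 10).
print(line_search_step(5.0, 10.0, loss_fn=lambda w: w ** 2))  # 4.0, i.e. eta = 0.1 wins
```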

7. Assume you have 1,50,000 data points and a mini-batch size of 25,000. One epoch implies one pass over the data, and one step means one update of the parameters. What is the number of steps in one epoch for Mini-Batch Gradient Descent?

a. 1
b. 1,50,000
c. 6
d. 60

`Answer:- c`
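Reasoning: one step is one parameter update, and each update consumes one mini-batch, so the number of steps in one epoch is 1,50,000 / 25,000 = 6.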

8. Which of the following learning rate decay methods require tuning of two hyperparameters?
I. step decay
II. exponential decay
III. 1/t decay

a. I and II
b. II and III
c. I and III
d. I, II and III

`Answer:- b`
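For reference, the typical forms of these schedules (a sketch with commonly used notation: eta0 is the initial learning rate and k the decay rate) show why exponential decay and 1/t decay each need two hyperparameters, while step decay is usually stated as a fixed rule such as "halve the learning rate every 5 epochs".

```python
import math

def exponential_decay(eta0, k, t):
    # two hyperparameters to tune: eta0 and k
    return eta0 * math.exp(-k * t)

def one_over_t_decay(eta0, k, t):
    # two hyperparameters to tune: eta0 and k
    return eta0 / (1 + k * t)
```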

9. How can you reduce the oscillations and improve the stochastic estimates of the gradient that is estimated from one data point at a time?

a. Mini-Batch
c. RMSprop

`Answer:- a`
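A small, purely illustrative simulation (synthetic numbers, not from the course) of why mini-batches help: averaging per-point gradients over a batch shrinks the variance of the gradient estimate, which reduces the oscillations in the updates.

```python
import numpy as np

rng = np.random.default_rng(0)
per_point_grads = rng.normal(loc=1.0, scale=2.0, size=10000)        # noisy single-point gradient estimates
mini_batch_grads = per_point_grads.reshape(-1, 25).mean(axis=1)     # averaged over batches of 25

print(per_point_grads.std(), mini_batch_grads.std())  # the mini-batch estimate is far less noisy
```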

10. Select all the statements that are TRUE.

a. RMSprop is very aggressive when decaying the learning rate
b. Adagrad decays the learning rate in proportion to the update history
d. RMSprop has overcome the problem of Adagrad getting stuck when close to convergence

`Answer:- b, c, d`
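For reference, a sketch of the two update rules (standard textbook formulations, not copied from the assignment) shows the difference: Adagrad accumulates the entire history of squared gradients, which decays the effective learning rate aggressively and can make it get stuck near convergence, while RMSprop keeps only an exponentially decaying average.

```python
import numpy as np

def adagrad_update(w, v, grad, eta=0.1, eps=1e-8):
    """Adagrad: v accumulates all past squared gradients, so the effective
    learning rate eta/sqrt(v) keeps shrinking and can stall near convergence."""
    v = v + grad ** 2
    w = w - (eta / np.sqrt(v + eps)) * grad
    return w, v

def rmsprop_update(w, v, grad, eta=0.1, beta=0.9, eps=1e-8):
    """RMSprop: v is an exponentially decaying average of squared gradients,
    which prevents the effective learning rate from decaying to zero."""
    v = beta * v + (1 - beta) * grad ** 2
    w = w - (eta / np.sqrt(v + eps)) * grad
    return w, v
```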