NPTEL Introduction To Machine Learning Week 5 Assignment Answer 2023

1. The perceptron learning algorithm is primarily designed for:

  1. Regression tasks
  2. Unsupervised learning
  3. Clustering tasks
  4. Linearly separable classification tasks
  5. Non-linear classification tasks
Answer :- For Answer Click Here
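To see why the perceptron suits linearly separable classification, here is a minimal sketch (my own illustration, not part of the assignment) of the perceptron learning rule trained on the linearly separable AND function:

```python
# Minimal perceptron learning sketch: weights move toward a separating
# hyperplane whenever a point is misclassified.
def perceptron_train(data, epochs=10, lr=1.0):
    w = [0.0, 0.0]  # weights
    b = 0.0         # bias
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred  # 0 when correct; +1/-1 when misclassified
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# AND is linearly separable, so the algorithm converges.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(data)
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x, _ in data]
```

On non-separable data (e.g. XOR) the same loop would never settle, which is why the algorithm is tied to the linearly separable case.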

2. The last layer of an ANN is linear for ____ and softmax for ____.

  1. Regression, Regression
  2. Classification, Classification
  3. Regression, Classification
  4. Classification, Regression
Answer :- For Answer Click Here
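The distinction can be seen in a small sketch (my own, with made-up scores): a regression head emits the raw linear values unchanged, while a classification head passes them through softmax to get a probability distribution.

```python
import math

def softmax(z):
    # subtract the max for numerical stability before exponentiating
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]   # raw scores from the final linear layer
probs = softmax(logits)    # classification head: probabilities summing to 1
linear_out = logits        # regression head: unconstrained real values
```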

3. Consider the following statement and answer True/False with corresponding reason:
The class outputs of a classification problem with an ANN cannot be treated independently.

  1. True. Due to cross-entropy loss function
  2. True. Due to softmax activation
  3. False. This is the case for regression with single output
  4. False. This is the case for regression with multiple outputs
Answer :- For Answer Click Here
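A quick sketch (my own illustration) of why softmax couples the class outputs: changing a single logit changes every output probability, because all outputs share one normalising sum.

```python
import math

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

a = softmax([1.0, 2.0, 3.0])
b = softmax([1.0, 2.0, 5.0])  # only the third logit changed,
# yet every output probability moves, since they share the normaliser
```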

4. Given below is a simple ANN with two inputs X1, X2 ∈ {0, 1} and edge weights −3, +2, +2.

[Figure: network diagram for Question 4]

Which of the following logical functions does it compute?

  1. XOR
  2. NOR
  3. NAND
  4. AND
Answer :- For Answer Click Here
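You can read the answer off the truth table. The sketch below (my own; it assumes −3 is the bias weight, +2 and +2 feed the two inputs, and the unit is a step-activated threshold neuron) evaluates the network on all four input pairs:

```python
def step(z):
    return 1 if z >= 0 else 0  # unit-step activation

def neuron(x1, x2):
    # assumed wiring: bias weight -3, input weights +2 and +2
    return step(-3 + 2 * x1 + 2 * x2)

table = [(x1, x2, neuron(x1, x2)) for x1 in (0, 1) for x2 in (0, 1)]
```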

5. Using the notations used in class, evaluate the output of the neural network with a 3-3-1 architecture (2-dimensional input, with one node for the bias term in each of the first two layers). The parameters are as follows:

[Figure: network parameters for Question 5]

Using the sigmoid function as the activation at both layers, the output of the network for the input (0.8, 0.7) will be (up to 4 decimal places):

  1. 0.7275
  2. 0.0217
  3. 0.2958
  4. 0.8213
  5. 0.7291
  6. 0.8414
  7. 0.1760
  8. 0.7552
  9. 0.9442
  10. None of these
Answer :- For Answer Click Here
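The computation itself is just two sigmoid layers applied in sequence. Below is a sketch of the forward pass for a 3-3-1 network; the weight matrices here are placeholders I invented purely to show the mechanics, since the actual parameters are given in the assignment figure and are not reproduced above.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# PLACEHOLDER weights, not the assignment's: each row is [bias, w_x1, w_x2].
W1 = [[0.1, 0.2, 0.3],   # hidden unit 1
      [0.4, 0.5, 0.6]]   # hidden unit 2
W2 = [0.7, 0.8, 0.9]     # output unit: [bias, w_h1, w_h2]

def forward(x1, x2):
    # hidden layer: sigmoid of (bias + weighted inputs) per unit
    h = [sigmoid(w[0] + w[1] * x1 + w[2] * x2) for w in W1]
    # output layer: sigmoid of (bias + weighted hidden activations)
    return sigmoid(W2[0] + W2[1] * h[0] + W2[2] * h[1])

y = forward(0.8, 0.7)
```

Substituting the figure's weights into `W1` and `W2` gives the value to match against the options.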

6. If the step size in gradient descent is too large, what can happen?

  1. Overfitting
  2. The model will not converge
  3. We can reach maxima instead of minima
  4. None of the above
Answer :- For Answer Click Here
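A one-dimensional sketch (my own example) makes the effect of the step size concrete: on f(x) = x², a small learning rate shrinks x toward the minimum at 0 each step, while a learning rate above 1 overshoots and the iterates grow without bound.

```python
def gd(lr, steps=20, x=1.0):
    # minimise f(x) = x^2; the gradient is 2x, so the update is x -= lr * 2x
    for _ in range(steps):
        x = x - lr * 2 * x
    return x

small = gd(0.1)   # each step multiplies x by 0.8: converges toward 0
large = gd(1.1)   # each step multiplies x by -1.2: |x| grows, diverges
```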

7. On different initializations of your neural network, you get significantly different values of loss. What could be the reason for this?

  1. Overfitting
  2. Some problem in the architecture
  3. Incorrect activation function
  4. Multiple local minima
Answer :- For Answer Click Here

8. The likelihood L(θ|X) is given by:

  1. P(θ|X)
  2. P(X|θ)
  3. P(X).P(θ)
  4. P(θ)P(X)
Answer :- For Answer Click Here
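The key point is that the likelihood treats the data as fixed and the parameter as the variable. A small Bernoulli example of my own (not from the assignment): the probability of observing 7 heads and 3 tails as a function of the coin's bias θ.

```python
# L(theta | X) is the probability of the fixed observations X under theta.
# Data (assumed for illustration): 10 flips, 7 heads, 3 tails.
def likelihood(theta, heads=7, tails=3):
    return theta**heads * (1 - theta)**tails

l_half = likelihood(0.5)  # likelihood of a fair coin given this data
l_mle = likelihood(0.7)   # theta = 7/10 maximises the likelihood here
```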

9. Why is proper initialization of neural network weights important?

  1. To ensure faster convergence during training
  2. To prevent overfitting
  3. To increase the model’s capacity
  4. Initialization doesn’t significantly affect network performance
  5. To minimize the number of layers in the network
Answer :- For Answer Click Here

10. Which of these are limitations of the backpropagation algorithm?

  1. It requires error function to be differentiable
  2. It requires activation function to be differentiable
  3. The ith layer cannot be updated before the update of layer i+1 is complete
  4. All of the above
  5. (1) and (2) only
  6. None of these
Answer :- For Answer Click Here
Course Name: Introduction To Machine Learning
Category: NPTEL Assignment Answer