**Deep Learning Assignment 2 Answers 2023** :- Hello students, in this article we are going to share the answers of NPTEL Deep Learning Week 2 Assignment 2023. All the answers are provided below to help the students as a reference; you must submit your assignment based on your own knowledge.

## NPTEL Deep Learning Week 2 Assignment Answers 2023

**1. Choose the correct option regarding discriminant functions g_{i}(x) for multiclass classification (x is the feature vector to be classified). Statement i: The risk value R(α_{i}|x) in a Bayes minimum risk classifier can be used as a discriminant function. Statement ii: The negative of the risk value, -R(α_{i}|x), in a Bayes minimum risk classifier can be used as a discriminant function. Statement iii: The a posteriori probability P(ω_{i}|x) in a Bayes minimum error classifier can be used as a discriminant function. Statement iv: The negative of the a posteriori probability, -P(ω_{i}|x), in a Bayes minimum error classifier can be used as a discriminant function.**

a. Only Statement i is true

b. Both Statements ii and iii are true

c. Both Statements i and iv are true

d. Both Statements i and iv are true

Answer :- b. Both Statements ii and iii are true (the conditional risk is minimized, so only its negative works as a discriminant; the posterior is maximized, so it can be used directly, not its negative).
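For reference, the standard discriminant-function constructions for these two classifiers (assign x to the class with the largest g_{i}(x)) are:

```latex
g_i(\mathbf{x}) = -R(\alpha_i \mid \mathbf{x}) \quad \text{(minimum risk: risk is minimized, so it is negated)} \\
g_i(\mathbf{x}) = P(\omega_i \mid \mathbf{x}) \quad \text{(minimum error: the posterior is maximized directly)}
```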

**2. Which of the following is true regarding functions of discriminant functions g_{i}(x), i.e., f(g_{i}(x))?**

a. We can not use functions of discriminant functions f(g(x)), as discriminant functions for multiclass classification.

b. We can use functions of discriminant functions, f(g(x)), as discriminant functions for multiclass classification provided, they are constant functions i.e., f(g(x)) = C where C is a constant.

c. We can use functions of discriminant functions, f(g(x)), as discriminant functions for multiclass classification provided, they are monotonically increasing functions.

d. None of the above is true.

Answer :-c. We can use functions of discriminant functions, f(g(x)), as discriminant functions for multiclass classification provided, they are monotonically increasing functions.
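A monotonically increasing f preserves the argmax over the discriminant values, which is why option c works (a constant f, as in option b, would destroy the ranking). A minimal sketch, where the discriminant values are made-up numbers for illustration:

```python
import math

# Hypothetical discriminant values g_i(x) for three classes (made-up numbers)
g = [0.2, 0.5, 0.3]

# Apply a monotonically increasing function f, e.g. the natural logarithm
f_g = [math.log(v) for v in g]

# The predicted class (argmax) is unchanged by the monotonic transform
pred_before = max(range(len(g)), key=lambda i: g[i])
pred_after = max(range(len(f_g)), key=lambda i: f_g[i])
print(pred_before, pred_after)  # both are class index 1
```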

**3. The class conditional probability density function for class ω_{i}, i.e., P(x|ω_{i}), for a multivariate normal (or Gaussian) distribution (where x is a d-dimensional feature vector) is given by**

Answer :-Click Here
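For reference, the textbook form of the d-dimensional multivariate normal class-conditional density, with mean vector μ_{i} and covariance matrix Σ_{i}, is:

```latex
P(\mathbf{x} \mid \omega_i) =
\frac{1}{(2\pi)^{d/2}\,|\Sigma_i|^{1/2}}
\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)\right)
```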

**4. There are some data points for two different classes given below. Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)} Class 2 points: {(3, 0), (1, -2), (5, -2), (3, -4)} Compute the mean vectors μ_{1} and μ_{2} for these two classes and choose the correct option.**

a. μ_{1} = [2 6] and μ_{2} = [3 -1]

b. μ_{1} = [3 6] and μ_{2} = [2 -2]

c. μ_{1} = [3 6] and μ_{2} = [3 -2]

d. μ_{1} = [3 5] and μ_{2} = [2 -3]

Answer :- c. μ_{1} = [3 6] and μ_{2} = [3 -2]
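The mean vectors can be checked with a few lines of NumPy (a quick sketch; the points are taken directly from the question):

```python
import numpy as np

# Data points from the question
class1 = np.array([(2, 6), (3, 4), (3, 8), (4, 6)])
class2 = np.array([(3, 0), (1, -2), (5, -2), (3, -4)])

# Mean vector of each class: average over the sample axis
mu1 = class1.mean(axis=0)  # -> [3. 6.]
mu2 = class2.mean(axis=0)  # -> [3. -2.]
print(mu1, mu2)
```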

**5. There are some data points for two different classes given below. Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)} Class 2 points: {(3, 0), (1, -2), (5, -2), (3, -4)} Compute the covariance matrices Σ1 and Σ2 and choose the correct option.**

Answer :-Click Here
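The covariance matrices can be computed the same way. Note that the numbers depend on whether you divide by N (`bias=True` below) or by N-1 (NumPy's default), so check which convention the given options use; the sketch below uses the 1/N convention:

```python
import numpy as np

class1 = np.array([(2, 6), (3, 4), (3, 8), (4, 6)])
class2 = np.array([(3, 0), (1, -2), (5, -2), (3, -4)])

# np.cov expects variables in rows and samples in columns, hence the transpose.
# bias=True divides by N; bias=False (the default) divides by N-1.
sigma1 = np.cov(class1.T, bias=True)  # -> [[0.5, 0.], [0., 2.]]
sigma2 = np.cov(class2.T, bias=True)  # -> [[2., 0.], [0., 2.]]
print(sigma1)
print(sigma2)
```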

**6. There are some data points for two different classes given below. Class 1 points: {(2, 6), (3, 4), (3, 8), (4, 6)} Class 2 points: {(3, 0), (1, -2), (5, -2), (3, -4)}**

Answer :-Click Here

**7. Let Σ_{i} represent the covariance matrix for the i^{th} class. Assume that the classes have the same covariance matrix. Also assume that the features are statistically independent and have the same variance. Which of the following is true?**

a. Σ_{i} = Σ (diagonal elements of Σ are zero)

b. Σ_{i} = Σ (diagonal elements of Σ are non-zero and different from each other, rest of the elements are zero)

c. Σ_{i} = Σ (diagonal elements of Σ are non-zero and equal to each other, rest of the elements are zero)

d. None of these

Answer :- c. Σ_{i} = Σ (diagonal elements of Σ are non-zero and equal to each other, rest of the elements are zero) — statistical independence makes the off-diagonal elements zero, and equal variances make the diagonal elements equal, i.e., Σ = σ²I.

**8. The decision surface between two normally distributed classes ω_{1} and ω_{2} is shown in the figure. Which of the following is true?**

Answer :-Click Here

**9. **

Answer :-Click Here

**10. You are given some data points for two different classes. Class 1 points: {(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)} Class 2 points: {(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)} Assume that the points are samples from normal distributions and a two-class Bayesian classifier is used to classify them. Also assume the prior probabilities of the classes are equal, i.e., P(ω1) = P(ω2). Which of the following is true about the corresponding decision boundary used in the classifier? (Choose the correct option regarding the given statements) Statement i: The decision boundary passes through the midpoint of the line segment joining the means of the two classes. Statement ii: The decision boundary will be the orthogonal bisector of the line joining the means of the two classes.**

a. Only Statement i is true

b. Only Statement ii is true

c. Both Statements i and ii are true

d. None of the statements are true

Answer :- c. Both Statements i and ii are true (the two classes turn out to have identical covariance matrices, and the mean-difference vector is an eigenvector of that covariance, so the linear boundary is the orthogonal bisector through the midpoint of the means).
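This can be verified numerically: with equal priors and equal class covariances the boundary is linear and passes through the midpoint of the means (statement i), and it is the orthogonal bisector exactly when the boundary normal Σ⁻¹(μ1 − μ2) is parallel to (μ1 − μ2) (statement ii). A sketch using the points from the question:

```python
import numpy as np

class1 = np.array([(11, 11), (13, 11), (8, 10), (9, 9), (7, 7), (7, 5), (15, 3)])
class2 = np.array([(7, 11), (15, 9), (15, 7), (13, 5), (14, 4), (9, 3), (11, 3)])

mu1, mu2 = class1.mean(axis=0), class2.mean(axis=0)  # -> [10, 8] and [12, 6]
s1 = np.cov(class1.T, bias=True)
s2 = np.cov(class2.T, bias=True)

# Equal covariance matrices -> linear boundary through the midpoint (statement i)
print(np.allclose(s1, s2))  # True

# Boundary normal w = inv(Sigma) @ (mu1 - mu2); if w is parallel to (mu1 - mu2),
# the boundary is the orthogonal bisector of the segment joining the means (statement ii)
diff = mu1 - mu2
w = np.linalg.inv(s1) @ diff
cross = w[0] * diff[1] - w[1] * diff[0]  # 2-D cross product: zero when parallel
print(abs(cross) < 1e-9)  # True
```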
