Introduction To Machine Learning – IITKGP Assignment 7 Answers 2022

NPTEL Introduction To Machine Learning – IITKGP Assignment 7 Answers 2022 [July-Dec]:- In this post, we have provided answers to NPTEL Introduction to Machine Learning – IITKGP Assignment 7 (Week 7). The answers are provided here only for reference. Please do your assignment based on your own knowledge.

About Introduction To Machine Learning – IITKGP

This course provides a concise introduction to the fundamental concepts of machine learning and popular machine learning algorithms. We will cover the standard and most popular supervised learning algorithms including linear regression, logistic regression, decision trees, k-nearest neighbour, an introduction to Bayesian learning and the naïve Bayes algorithm, support vector machines and kernels and neural networks with an introduction to Deep Learning. We will also cover the basic clustering algorithms. Feature reduction methods will also be discussed.

We will introduce the basics of computational learning theory. In the course, we will discuss various issues related to the application of machine learning algorithms. We will discuss hypothesis space, overfitting, bias and variance, tradeoffs between representational power and learnability, evaluation strategies and cross-validation. The course will be accompanied by hands-on problem-solving with programming in Python and some tutorial sessions.

CRITERIA TO GET A CERTIFICATE

Average assignment score = 25% of the average of the best 6 assignments out of the total 8 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF YOUR AVERAGE ASSIGNMENT SCORE >=10/25 AND YOUR EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
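
The scoring rule above is simple arithmetic; here is a tiny Python sketch of it for quick self-checking (the marks below are made-up examples, not official data):

```python
# Sketch of the certificate rule described above (inputs are made-up examples).
def certificate_result(avg_best6_assignments, exam_marks_out_of_100):
    assignment_score = 0.25 * avg_best6_assignments   # scaled to 25
    exam_score = 0.75 * exam_marks_out_of_100         # scaled to 75
    final_score = assignment_score + exam_score
    eligible = assignment_score >= 10 and exam_score >= 30
    return final_score, eligible

print(certificate_result(80, 50))   # (57.5, True)
print(certificate_result(80, 30))   # (42.5, False) -> fails the exam criterion
```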

Introduction To Machine Learning – IITKGP Assignment 7 Answers NPTEL 2022 [July-Dec]

1. Which of the following options is/are correct regarding the benefits of an ensemble model?

1. Better performance
2. More generalized model
3. Better interpretability

A) 1 and 3
B) 2 and 3
C) 1 and 2
D) 1, 2 and 3

Answer:- c
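
For intuition (this is not part of the assignment), here is a minimal scikit-learn sketch: a hard-voting ensemble of a few standard classifiers on synthetic data. Such an ensemble usually performs and generalizes at least as well as a single base model, but the combined model is harder to interpret than a single decision tree. All dataset and model choices below are arbitrary illustrations.

```python
# Illustrative sketch: a hard-voting ensemble vs. a single base model.
# Exact scores depend on the random data; the point is that combining
# several base learners usually helps performance/generalization,
# while the combined model is less interpretable than one tree.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
ensemble = VotingClassifier([
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
    ("knn", KNeighborsClassifier()),
]).fit(X_tr, y_tr)

print("single tree    :", single.score(X_te, y_te))
print("voting ensemble:", ensemble.score(X_te, y_te))
```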

2. In AdaBoost, we give more weight to points that were misclassified in previous iterations. Now, suppose we introduce a limit or cap on the weight that any point can take (for example, say we introduce a restriction that prevents any point’s weight from exceeding a value of 10). Which among the following would be an effect of such a modification?

A) We may observe the performance of the classifier reduce as the number of stages increases.
B) It makes the final classifier robust to outliers.
C) It may result in lower overall performance.
D) None of these.

Answer:- b, c
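
To make the modification concrete, below is a rough sketch of an AdaBoost-style sample-weight update with a hypothetical cap. The cap value and the clipping step follow the question's description; this is not the exact AdaBoost implementation used anywhere in particular.

```python
# Rough sketch of an AdaBoost-style sample-weight update with a hypothetical
# cap on individual weights, mirroring the modification in the question.
import numpy as np

def update_weights(w, misclassified, alpha, cap=10.0):
    # Usual AdaBoost-style update: boost the weights of misclassified points.
    w = w * np.exp(alpha * misclassified.astype(float))
    # Hypothetical cap from the question: no single point's weight may exceed
    # the cap, so extreme points (e.g. outliers) cannot dominate later stages.
    w = np.minimum(w, cap)
    return w / w.sum()   # renormalize as usual

w = np.ones(5) / 5
misclassified = np.array([True, False, False, True, False])
# With alpha = 5, the raw weight 0.2 * e**5 ~ 29.7 would exceed the cap,
# so the two misclassified points are clipped to 10 before normalization.
print(update_weights(w, misclassified, alpha=5.0))
```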


3. Which among the following are some of the differences between bagging and boosting?

A) In bagging we use the same classification algorithm for training on each sample of the data, whereas in boosting, we use different classification algorithms on the different training data samples.
B) Bagging is easy to parallelize whereas boosting is inherently a sequential process.
C) In bagging we typically use sampling with replacement whereas in boosting, we typically use weighted sampling techniques.
D) In comparison with the performance of a base classifier on a particular dataset, bagging will generally not increase the error, whereas boosting may lead to an increase in the error.

Answer:- b, c, d
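
As a reference point for options B and C, scikit-learn's APIs mirror these differences: BaggingClassifier draws bootstrap samples (sampling with replacement) and exposes n_jobs for parallel training, while AdaBoostClassifier reweights points and fits its members sequentially. A minimal sketch (parameter names follow recent scikit-learn versions, where the base learner is passed as estimator):

```python
# Sketch contrasting bagging and boosting with the same base learner.
# Bagging: bootstrap samples (sampling with replacement), trivially parallel.
# Boosting: weighted points, members fitted one after another.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
base = DecisionTreeClassifier(max_depth=3, random_state=0)

bagging = BaggingClassifier(estimator=base, n_estimators=50,
                            bootstrap=True,   # sampling with replacement
                            n_jobs=-1,        # members can train in parallel
                            random_state=0).fit(X, y)

boosting = AdaBoostClassifier(estimator=base, n_estimators=50,
                              random_state=0).fit(X, y)  # inherently sequential
```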

4. What is the VC-dimension of the class of sphere in a 3-dimensional plane?

A) 3
B) 4
C) 5
D) 6

Answer:- b
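
For context, the standard learning-theory result for balls (sphere interiors) used as binary classifiers in d-dimensional space is the following, and plugging in d = 3 gives 4:

```latex
% VC dimension of balls in R^d (points inside labelled 1, outside 0)
\operatorname{VCdim}\bigl(\{\, x \mapsto \mathbf{1}[\lVert x - c \rVert \le r] \;:\; c \in \mathbb{R}^d,\ r > 0 \,\}\bigr) = d + 1
% For d = 3: VC dimension = 3 + 1 = 4 (option B).
```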

5. Considering the AdaBoost algorithm, which among the following statements is true?

A) In each stage, we try to train a classifier which makes accurate predictions on any subset of the data points where the subset size is at least half the size of the dataset.
B) In each stage, we try to train a classifier which makes accurate predictions on a subset of the data points where the subset contains more of the data points which were misclassified in earlier stages.
C) The weight assigned to an individual classifier depends upon the number of data points correctly classified by the classifier.
D) The weight assigned to an individual classifier depends upon the weighted sum error of misclassified points for that classifier.

Answer:- b, d
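
Option D can be read off the usual stage-weight formula, alpha = 0.5 * ln((1 - err) / err), where err is the weighted error of that stage's classifier. A tiny sketch with made-up numbers:

```python
# Sketch: an AdaBoost stage's weight (alpha) depends on the WEIGHTED error
# of its classifier, not on the raw count of correctly classified points.
import numpy as np

def stage_weight(sample_weights, misclassified, eps=1e-10):
    err = np.sum(sample_weights * misclassified) / np.sum(sample_weights)
    return 0.5 * np.log((1.0 - err + eps) / (err + eps))

w = np.array([0.1, 0.2, 0.3, 0.4])   # current (normalized) point weights
miss = np.array([1, 0, 0, 0])        # stage misclassifies only a low-weight point
print(stage_weight(w, miss))         # weighted error 0.1 -> alpha ~ 1.10
```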

6. Suppose the VC dimension of a hypothesis space is 6. Which of the following are true?

A) At least one set of 6 points can be shattered by the hypothesis space.
B) Two sets of 6 points can be shattered by the hypothesis space.
C) All sets of 6 points can be shattered by the hypothesis space.
D) No set of 7 points can be shattered by the hypothesis space.

Answer:- a, d
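
The wording of options A, C and D is exactly what "shattering" does and does not require. As a toy illustration (not part of the assignment), the sketch below brute-forces shattering for 1-D threshold classifiers, whose VC dimension is 1: some single point can be shattered, but no pair of points can.

```python
# Toy illustration: brute-force "shattering" check for 1-D threshold
# classifiers h_t(x) = 1 if x >= t else 0, a class with VC dimension 1.
from itertools import product

def shatters(points):
    # Thresholds below the smallest point, at each point, and above the
    # largest point realise every labelling this class can produce.
    thresholds = [min(points) - 1.0] + list(points) + [max(points) + 1.0]
    achievable = {tuple(int(x >= t) for x in points) for t in thresholds}
    return all(lab in achievable for lab in product([0, 1], repeat=len(points)))

print(shatters([3.0]))        # True : some set of 1 point is shattered
print(shatters([1.0, 2.0]))   # False: the labelling (1, 0) is unreachable
```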


7. Ensembles will yield bad results when there is a significant diversity among the models. Write True or False.

A) True
B) False

Answer:- b

8. Which of the following algorithms is not an ensemble learning algorithm?

A) Random Forest
B) Adaboost
C) Gradient Boosting
D) Decision Trees

Answer:- d

9. Which of the following can be true for selecting base learners for an ensemble?

A) Different learners can come from the same algorithm with different hyperparameters
B) Different learners can come from different algorithms
C) Different learners can come from different training spaces
D) All of the above.

Answer:- d
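
For example, options A and B together describe something like scikit-learn's VotingClassifier, whose members may be the same algorithm with different hyperparameters and/or entirely different algorithms (a rough sketch with arbitrary choices):

```python
# Sketch: ensemble members from the same algorithm with different
# hyperparameters, plus one member from a different algorithm entirely.
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

members = [(f"tree_depth_{d}", DecisionTreeClassifier(max_depth=d, random_state=0))
           for d in (1, 3, 5)]                                 # same algorithm, different hyperparameters
members.append(("logreg", LogisticRegression(max_iter=1000)))  # a different algorithm
ensemble = VotingClassifier(members)   # option C: members could additionally be
                                       # trained on different subsets of the data
```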

10. Generally, an ensemble method works better if the individual base models have ____________?

Note: Individual models have accuracy greater than 50%

A) Less correlation among predictions
B) High correlation among predictions
C) Correlation does not have an impact on the ensemble output
D) None of the above.

Answer:- a
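
A quick way to see why low correlation helps is a small simulation (a made-up setup, not from the assignment): five base models, each about 70% accurate. When their errors are independent, majority voting clearly helps; when they are perfectly correlated, voting adds nothing.

```python
# Sketch: majority voting over five base models, each ~70% accurate.
# Independent (low-correlation) errors tend to cancel out under voting;
# perfectly correlated models gain nothing from being combined.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_models, p_correct = 100_000, 5, 0.7
y = np.zeros(n_samples, dtype=int)          # true label is 0 everywhere (WLOG)

# Independent models: each predicts the wrong label (1) with probability 0.3.
preds_indep = (rng.random((n_models, n_samples)) > p_correct).astype(int)
vote_indep = (preds_indep.sum(axis=0) > n_models // 2).astype(int)

# Perfectly correlated models: every member copies the first model.
preds_corr = np.tile(preds_indep[0], (n_models, 1))
vote_corr = (preds_corr.sum(axis=0) > n_models // 2).astype(int)

print("independent ensemble accuracy:", (vote_indep == y).mean())  # ~0.84
print("correlated ensemble accuracy :", (vote_corr == y).mean())   # ~0.70
```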
