NPTEL Deep Learning – IIT Ropar Week 8 Assignment Answer 2023


1. Which of the following best describes the concept of saturation in deep learning?

  • When the activation function output approaches either 0 or 1 and the gradient is close to zero.
  • When the activation function output is very small and the gradient is close to zero.
  • When the activation function output is very large and the gradient is close to zero.
  • None of the above.
Answer :- When the activation function output approaches either 0 or 1 and the gradient is close to zero.
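A minimal sketch (pure Python, illustrative inputs) of why sigmoid units saturate: once the pre-activation is large in magnitude, the output is pinned near 0 or 1 and the gradient is nearly zero, so learning through that unit stalls.

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of the sigmoid: s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

# The gradient peaks at 0.25 (at x = 0) and vanishes for large |x|.
for x in [0.0, 2.0, 10.0]:
    print(f"x={x:5.1f}  sigmoid={sigmoid(x):.6f}  grad={sigmoid_grad(x):.6f}")
```

At x = 10 the output is within 5e-5 of 1 and the gradient is on the order of 1e-5, which is the saturation regime the question describes.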

2. Which of the following methods can help to avoid saturation in deep learning?

  • Using a different activation function.
  • Increasing the learning rate.
  • Increasing the model complexity
  • All of the above.
Answer :- Using a different activation function.
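A small comparison (pure Python, illustrative value) of why switching the activation function helps: at the same large pre-activation where a sigmoid's gradient has vanished, a ReLU's gradient is still 1, so the weight update does not die out.

```python
import math

def sigmoid_grad(x):
    """Derivative of the logistic sigmoid."""
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

def relu_grad(x):
    """Derivative of ReLU: 1 for positive inputs, 0 otherwise."""
    return 1.0 if x > 0 else 0.0

x = 10.0
print(f"sigmoid grad at x={x}: {sigmoid_grad(x):.2e}")  # vanishingly small
print(f"relu grad at x={x}:    {relu_grad(x)}")          # stays at 1
```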

3. Which of the following is true about the role of unsupervised pre-training in deep learning?

  • It is used to replace the need for labeled data
  • It is used to initialize the weights of a deep neural network
  • It is used to fine-tune a pre-trained model
  • It is only useful for small datasets
Answer :- It is used to initialize the weights of a deep neural network.
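A toy sketch of the idea, using PCA on unlabeled data as a stand-in for unsupervised pre-training of one layer (the data, sizes, and the tied-weight reconstruction are all illustrative assumptions, not the course's exact setup): weights derived from the unlabeled data reconstruct it far better than random initial weights, which is why they make a good starting point for a deep network.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy unlabeled data with 4-dimensional structure plus a little noise.
Z = rng.normal(size=(200, 4))
A = rng.normal(size=(4, 8))
X = Z @ A + 0.1 * rng.normal(size=(200, 8))

def recon_error(W):
    """Mean squared error of a linear encode/decode with tied weights W."""
    X_hat = X @ W @ W.T
    return float(np.mean((X_hat - X) ** 2))

# "Pre-trained" weights: top-4 principal directions of the unlabeled data.
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
W_pre = Vt[:4].T                        # shape (8, 4)

# Random initialization for comparison.
W_rand = rng.normal(scale=0.1, size=(8, 4))

print("random init reconstruction error:", recon_error(W_rand))
print("pre-trained reconstruction error:", recon_error(W_pre))
```

The pre-trained weights already capture the data's structure before any labels are seen; fine-tuning with labels then starts from this initialization rather than from scratch.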

4. Which of the following is an advantage of unsupervised pre-training in deep learning?

  • It helps in reducing overfitting
  • Pre-trained models converge faster
  • It improves the accuracy of the model
  • It requires fewer computational resources
Answer :- It helps in reducing overfitting.

5. What is the main cause of the Dead ReLU problem in deep learning?

  • High variance
  • High negative bias
  • Overfitting
  • Underfitting
Answer :- High negative bias.
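A minimal illustration (hypothetical weight and bias values) of how a large negative bias kills a ReLU neuron: the pre-activation is negative for every realistic input, so the output, and therefore the gradient, is zero everywhere and gradient descent can never revive the unit.

```python
def relu(x):
    return max(0.0, x)

# Hypothetical neuron y = relu(w*x + b) with a large negative bias.
w, b = 0.5, -100.0
inputs = [-3.0, 0.0, 1.0, 5.0, 50.0]
outputs = [relu(w * x + b) for x in inputs]
print(outputs)  # all zeros: the neuron is "dead"
# Because the output is 0 for every input, the gradient through the
# neuron is also 0, so w and b never change and the unit stays dead.
```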

6. How can you tell if your network is suffering from the Dead ReLU problem?

  • The loss function is not decreasing during training
  • The accuracy of the network is not improving
  • A large number of neurons have zero output
  • The network is overfitting to the training data
Answer :- A large number of neurons have zero output.
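In practice you can detect this by measuring what fraction of units output zero for every example in a batch. A small diagnostic sketch (toy batch, illustrative values):

```python
def relu(x):
    return max(0.0, x)

def dead_fraction(pre_activations):
    """Fraction of units whose ReLU output is zero for every example."""
    outs = [[relu(z) for z in row] for row in pre_activations]
    n_units = len(outs[0])
    dead = sum(
        all(row[j] == 0.0 for row in outs) for j in range(n_units)
    )
    return dead / n_units

# Toy batch of pre-activations: 3 examples x 3 units; the second unit's
# pre-activation is negative for every example, so it never fires.
batch = [[1.0, -2.0, 0.5],
         [0.3, -1.0, -0.2],
         [2.0, -5.0, 1.1]]
print(dead_fraction(batch))  # one of the three units is dead
```

A fraction that stays high (or grows) across training batches is the symptom the question is after.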

7. What is the mathematical expression for the ReLU activation function?

  • f(x) = x if x < 0, 0 otherwise
  • f(x) = 0 if x > 0, x otherwise
  • f(x) = max(0,x)
  • f(x) = min(0,x)
Answer :- f(x) = max(0,x).
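A quick check that max(0, x) matches the piecewise reading of ReLU (illustrative test points):

```python
def relu(x):
    return max(0.0, x)

def relu_piecewise(x):
    # Equivalent piecewise form: f(x) = x if x > 0, else 0.
    return x if x > 0 else 0.0

for x in [-3.0, -0.5, 0.0, 2.5, 10.0]:
    assert relu(x) == relu_piecewise(x)
print("max(0, x) matches the piecewise definition")
```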

8. What is the main cause of the symmetry breaking problem in deep learning?

  • High variance
  • High bias
  • Overfitting
  • Equal initialization of weights
Answer :- Equal initialization of weights.
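A toy demonstration (1-input, 2-hidden-unit, 1-output net with tanh; all values illustrative, and the output weights are held fixed for simplicity) of why equal initialization is the problem: identically initialized units receive identical gradients, so they remain identical after every update and the network never breaks the symmetry.

```python
import math

w_hidden = [0.5, 0.5]   # equal initialization of both hidden weights
w_out = [0.3, 0.3]
x, target, lr = 1.0, 1.0, 0.1

for _ in range(5):
    h = [math.tanh(w * x) for w in w_hidden]
    y = sum(wo * hi for wo, hi in zip(w_out, h))
    err = y - target
    # Gradient of the squared error w.r.t. each hidden weight.
    grads = [err * w_out[i] * (1 - h[i] ** 2) * x for i in range(2)]
    w_hidden = [w - lr * g for w, g in zip(w_hidden, grads)]

print(w_hidden)  # the two weights have moved, but are still identical
```

Random (unequal) initialization gives each unit a different gradient from the first step, which is exactly how the symmetry is broken in practice.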

9. What is the purpose of Batch Normalization in Deep Learning?

  • To improve the generalization of the model
  • To reduce overfitting
  • To reduce bias in the model
  • To ensure that the distribution of the inputs at different layers doesn’t change
Answer :- To ensure that the distribution of the inputs at different layers doesn’t change.

10. In Batch Normalization, which parameter is learned during training?

  • Mean
  • Variance
  • γ
  • ϵ
Answer :- γ.
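A minimal sketch of batch normalization for a single feature (toy batch, illustrative γ and β): the mean and variance are computed from the batch, ε is a fixed constant for numerical stability, and γ (scale) and β (shift) are the parameters learned during training.

```python
import math

def batch_norm(xs, gamma, beta, eps=1e-5):
    """Normalize a batch to zero mean / unit variance, then apply the
    learned scale (gamma) and shift (beta)."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return [gamma * (x - m) / math.sqrt(v + eps) + beta for x in xs]

xs = [1.0, 2.0, 3.0, 4.0]
out = batch_norm(xs, gamma=2.0, beta=0.5)
print(out)
# After the transform, the batch has mean beta and standard deviation
# (approximately) gamma, regardless of the input distribution.
```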
Course Name: Deep Learning – IIT Ropar
Category: NPTEL Assignment Answer
