Wednesday 27 December 2023

Deep Learning and Reinforcement Learning Week 5 All Quiz

Practice: Transfer Learning 

1. The main idea of transfer learning of a neural network is:

ANSWER= (A) To keep the early layers of a pre-trained network and re-train the later layers for a specific application.

 

2. In the context of transfer learning, which is a guiding principle of fine-tuning?

ANSWER= (B) Using data that is similar to the pre-trained network

 

3. In the context of transfer learning, what do we call the process in which you train only the last layer or a few layers instead of all layers of a neural network?

ANSWER= (A) Frozen layers
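
The freezing idea behind these three questions can be sketched in plain Python (a toy two-layer model, not real Keras; the `trainable` flag is borrowed from Keras's `layer.trainable` for illustration):

```python
# Toy illustration of "frozen" layers: a layer whose trainable flag is
# False keeps its weight fixed, while trainable layers keep updating.
class ToyLayer:
    def __init__(self, weight, trainable=True):
        self.weight = weight
        self.trainable = trainable  # mirrors Keras's layer.trainable flag

    def update(self, gradient, lr=0.1):
        # Frozen layers simply skip the weight update.
        if self.trainable:
            self.weight -= lr * gradient

# The pre-trained "early" layer is frozen; the "later" layer is re-trained.
early = ToyLayer(weight=0.5, trainable=False)
late = ToyLayer(weight=0.8, trainable=True)

for _ in range(3):  # a few fake training steps
    early.update(gradient=1.0)
    late.update(gradient=1.0)

print(early.weight)          # 0.5 -> unchanged, the layer was frozen
print(round(late.weight, 2)) # 0.5 -> reduced by three update steps
```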


Practice: Convolutional Neural Network Architectures  

1. This concept arose as a solution for CNNs in which each layer is turned into parallel branches of convolutions:

ANSWER= (A) Inception

 

2. Which CNN Architecture is considered the flash point for modern Deep Learning?

ANSWER= (A) AlexNet

 

3. Which CNN Architecture can be described as a "simplified, deeper LeNet" in which the more layers, the better?

ANSWER= (C) VGG

4. Which CNN Architecture is the precursor of using convolutions to obtain better features, and was first applied to the MNIST data set?

ANSWER= (E) LeNet

 

5. The motivation behind this CNN Architecture was to solve the inability of deeper networks to fit (or even overfit) the training data better as more layers are added.

ANSWER= (E) ResNet

 

6. This CNN Architecture keeps passing both the initial unchanged information and the transformed information to the next layer.

ANSWER= (E) ResNet
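
The residual idea in question 6 fits in a few lines: the block's output is the transformed signal plus the unchanged input. A toy scalar version (real ResNet blocks use convolutions, batch norm, and ReLU in place of `transform`):

```python
def transform(x):
    # Stand-in for the block's learned transformation F(x); in a real
    # ResNet this would be a stack of conv / batch-norm / ReLU layers.
    return 0.1 * x

def residual_block(x):
    # Skip connection: the unchanged input x is passed forward
    # alongside the transformed information F(x).
    return x + transform(x)

print(residual_block(10.0))  # 11.0 = 10.0 (unchanged) + 1.0 (transformed)
```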

 

7. Which activation function was notably used in AlexNet and contributed to its success?

ANSWER= (A) ReLU (Rectified Linear Unit)
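
ReLU itself is a one-liner, max(0, x), which avoids the saturating gradients of sigmoid/tanh and helped AlexNet train much faster:

```python
def relu(x):
    # ReLU passes positive values through unchanged and zeroes out
    # negatives; its gradient is 1 for x > 0, so it does not saturate.
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # [0.0, 0.0, 0.0, 1.5]
```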

 


Practice: Regularization  

1. Which regularization technique can shrink the coefficients of the less important features to zero?

ANSWER= (D) L1
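
The reason L1 (lasso) can zero out coefficients is visible in the soft-thresholding step that lasso solvers apply: each coefficient is shrunk toward zero and set exactly to zero once the penalty outweighs its magnitude. A minimal sketch of that operator (an illustration, not a full solver):

```python
def soft_threshold(coef, penalty):
    # L1's proximal step: shrink the coefficient by the penalty, and
    # snap it to exactly zero when the penalty exceeds its magnitude.
    if coef > penalty:
        return coef - penalty
    if coef < -penalty:
        return coef + penalty
    return 0.0

coefs = [2.0, 0.3, -0.1, -1.5]
print([soft_threshold(c, 0.5) for c in coefs])  # [1.5, 0.0, 0.0, -1.0]
```

The small coefficients (0.3 and -0.1) land exactly on zero, which is why L1 performs feature selection while L2 only shrinks coefficients without zeroing them.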

 

2. (True/False) Batch Normalization tackles the internal covariate shift issue by always normalizing the input signals, thus accelerating the training of deep neural nets and increasing the generalization power of the networks.

ANSWER= (A) True

 

3. Regularization is used to mitigate which issue in model training?

ANSWER= (C) Overfitting

 


Week 5 Final Quiz  

1. (True/False) In Keras, the Dropout layer has an argument called rate, which is a probability that represents how often we want to invoke the layer in the training.

ANSWER= (B) False
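
The statement is false because `rate` is the fraction of input units set to zero on each forward pass, not how often the layer runs. A plain-Python sketch of inverted dropout (illustrative; Keras's `Dropout` layer does this internally):

```python
import random

def dropout(inputs, rate, training=True):
    # rate is the fraction of input units zeroed out each forward pass.
    # Survivors are scaled by 1/(1-rate) so the expected activation sum
    # is unchanged ("inverted dropout"); at inference time nothing drops.
    if not training:
        return list(inputs)
    keep = 1.0 - rate
    return [x / keep if random.random() < keep else 0.0 for x in inputs]

random.seed(0)
out = dropout([1.0] * 1000, rate=0.2)
dropped = sum(1 for x in out if x == 0.0)
print(dropped)  # roughly 200 of the 1000 units are zeroed
```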

 

2. What is a benefit of applying transfer learning to neural networks?

ANSWER= (B) Save early layers for generalization before re-training later layers for specific applications.

 

3. By setting `layer.trainable = False` for certain layers in a neural network, we ____.

ANSWER= (D) freeze the layers such that their weights don’t update during training.

 

4. Which option correctly orders the steps of implementing transfer learning?
1. Freeze the early layers of the pre-trained model.
2. Improve the model by fine-tuning.
3. Train the model with a new output layer in place.
4. Select a pre-trained model as the base of our training.

ANSWER= (B) 4, 1, 3, 2

 

5. Given a 100x100-pixel RGB image, there are _____ features.

ANSWER= (C) 30000
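
The arithmetic behind the answer: each pixel contributes one value per colour channel, so an RGB image has height x width x 3 input features:

```python
def num_features(height, width, channels):
    # Each pixel contributes one feature per colour channel.
    return height * width * channels

print(num_features(100, 100, 3))  # 30000 for a 100x100 RGB image
print(num_features(100, 100, 1))  # 10000 for the grayscale equivalent
```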

 

6. Before a CNN is ready for classifying images, which layer must we add last?

ANSWER= (A) Dense layer with the number of units corresponding to the number of classes
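
That final Dense layer maps the CNN's features to one score per class, typically followed by softmax so the scores become probabilities (in Keras this is `Dense(num_classes, activation='softmax')`). A minimal plain-Python softmax for illustration:

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize
    # the exponentials so the outputs sum to 1 like probabilities.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])  # three class scores -> three probabilities
print(len(probs))                 # 3, one output unit per class
print(round(sum(probs), 6))       # 1.0, a valid probability distribution
```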

 

7. In a CNN, the depth of a layer corresponds to the number of:

ANSWER= (A) filters applied
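
This can be checked with shape arithmetic: a convolution with k filters produces an output volume whose depth is k, regardless of the input's depth. A sketch assuming 'valid' padding and stride 1 (the hypothetical helper below is for illustration only):

```python
def conv_output_shape(in_h, in_w, kernel, num_filters):
    # With 'valid' padding and stride 1, the spatial size shrinks by
    # kernel - 1 in each dimension, while the output depth equals the
    # number of filters applied at this layer.
    out_h = in_h - kernel + 1
    out_w = in_w - kernel + 1
    return (out_h, out_w, num_filters)

print(conv_output_shape(32, 32, kernel=3, num_filters=64))  # (30, 30, 64)
```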
