Practice: Optimizers and Data Shuffling
1. True/False. Multi-layer perceptrons always have a hidden layer.
2. True/False. Multi-layer perceptrons are considered a type of feedforward neural network.
3. Select the correct rule of thumb regarding training a neural network. In general, as you train a neural network:
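For reference, here is a minimal sketch of the kind of multi-layer perceptron these questions describe: a feedforward Keras model with one hidden layer. The layer sizes and random data are hypothetical, and the imports assume the TensorFlow-bundled Keras (the standalone keras package is analogous).

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# Hypothetical data: 100 samples, 20 features, binary labels.
X = np.random.rand(100, 20)
y = np.random.randint(0, 2, size=(100,))

# A multi-layer perceptron is a feedforward network with at least one hidden layer.
model = Sequential([
    Input(shape=(20,)),
    Dense(16, activation='relu'),    # hidden layer
    Dense(1, activation='sigmoid'),  # output layer
])
model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=5, batch_size=10, verbose=0)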
Week 3 Final Quiz
1. What is the main function of backpropagation when training a Neural Network?
2. (True/False) The “vanishing gradient” problem can be solved by using a different activation function.
3. (True/False) Every node in a neural network has an activation function.
4. All of these are activation functions, except:
5. Deep Learning uses deep Neural Networks for all of the following applications, except:
6. All of these are activation functions, except:
7. (True/False) Optimizer approaches for Deep Learning regularization use gradient descent.
8. Stochastic gradient descent uses which type of batching method:
9. (True/False) The main purpose of data shuffling during the training of a Neural Network is to aid convergence and use the data in a different order each epoch.
10. This is a high-level library that is commonly used to train deep learning models and runs on either TensorFlow or Theano:
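As a companion to questions 7–10, here is a minimal sketch of how these pieces fit together in Keras: a gradient-descent optimizer (SGD), mini-batch sizing via batch_size, and per-epoch data shuffling via shuffle=True. The dataset, layer sizes, and hyperparameters are hypothetical; the imports again assume the TensorFlow-bundled Keras.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.optimizers import SGD

# Hypothetical data: 500 samples, 10 features, binary labels.
X = np.random.rand(500, 10)
y = np.random.randint(0, 2, size=(500,))

model = Sequential([
    Input(shape=(10,)),
    Dense(32, activation='relu'),    # ReLU helps avoid vanishing gradients
    Dense(1, activation='sigmoid'),
])

# Optimizers such as SGD update the weights with gradient descent,
# using gradients computed by backpropagation.
model.compile(optimizer=SGD(learning_rate=0.01), loss='binary_crossentropy')

# batch_size=32 gives mini-batch gradient descent; shuffle=True (the default)
# reorders the training data at the start of every epoch to aid convergence.
model.fit(X, y, epochs=3, batch_size=32, shuffle=True, verbose=0)

Setting batch_size to the full dataset size would give batch gradient descent, while batch_size=1 corresponds to pure stochastic updates.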