Phase 5: Neural Networks - Pre-Quiz
Time: 15 minutes
Questions: 10
Passing Score: 70%
Purpose: Assess your baseline knowledge before learning about neural networks
Question 1 (Easy)
What is a neural network inspired by?
A) Computer circuits
B) The human brain ✓
C) Mathematical functions
D) Decision trees
Explanation
Answer: B) The human brain
Neural networks are loosely inspired by biological neurons in the brain. They consist of interconnected nodes (neurons) that process information, similar to how biological neurons fire signals.
Reference: Phase 5 - Introduction to Neural Networks
Question 2 (Easy)
What is the basic unit of a neural network called?
A) Cell
B) Node
C) Neuron ✓
D) Layer
Explanation
Answer: C) Neuron
While "node" is also commonly used, the formal term is "neuron" or "artificial neuron," reflecting the biological inspiration.
Reference: Phase 5 - Neural Network Basics
Question 3 (Medium)
In a neural network, what does a "weight" represent?
A) The size of the network
B) The strength of a connection between neurons ✓
C) The number of neurons
D) The learning rate
Explanation
Answer: B) The strength of a connection between neurons
Weights are parameters that determine how much influence one neuron has on another. During training, these weights are adjusted to improve the network's performance.
Reference: Phase 5 - Weights and Biases
Question 4 (Medium)
What is the purpose of an activation function?
A) To speed up training
B) To introduce non-linearity ✓
C) To reduce overfitting
D) To normalize inputs
Explanation
Answer: B) To introduce non-linearity
Activation functions introduce non-linearity into the network, allowing it to learn complex patterns. Without them, a stack of layers collapses into a single linear model, no matter how many layers it has.
Reference: Phase 6 - Activation Functions
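A minimal sketch of the "collapse" point above, with made-up weights chosen for illustration: two linear layers with no activation are exactly equivalent to one linear layer, while inserting a ReLU breaks that equivalence.

```python
import numpy as np

# Without activations, stacking layers collapses to one linear map:
# W2 @ (W1 @ x) == (W2 @ W1) @ x, so depth adds no expressive power.
W1 = np.array([[1.0, 0.0, 0.0],
               [0.0, -1.0, 0.0]])   # layer 1 weights (2x3), hypothetical
W2 = np.array([[1.0, 1.0]])         # layer 2 weights (1x2), hypothetical
x = np.array([1.0, 2.0, 3.0])

linear_stack = W2 @ (W1 @ x)        # [-1.0]
collapsed = (W2 @ W1) @ x           # [-1.0] -- the same single linear layer

# A non-linearity (here ReLU) breaks the collapse:
relu = lambda z: np.maximum(z, 0)
nonlinear = W2 @ relu(W1 @ x)       # [1.0] -- no longer a linear map of x
print(linear_stack, collapsed, nonlinear)
```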
Question 5 (Easy)
What are the three main types of layers in a typical neural network?
A) Input, Output, Middle
B) Input, Hidden, Output ✓
C) Start, Process, End
D) Data, Compute, Result
Explanation
Answer: B) Input, Hidden, Output
Input layer: Receives the initial data
Hidden layer(s): Process the data (can be multiple)
Output layer: Produces the final prediction
Reference: Phase 5 - Network Architecture
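The three layer types can be sketched as a tiny forward pass. The weights and inputs below are made up for illustration (the first hidden neuron deliberately reuses the numbers from Question 8):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0)

# Hypothetical tiny network: 3 inputs -> 2 hidden neurons -> 1 output.
W_hidden = np.array([[0.5, -0.3, 0.8],
                     [0.2, 0.4, -0.1]])   # hidden layer (2 neurons x 3 inputs)
b_hidden = np.array([0.1, 0.0])
W_out = np.array([[1.0, -1.0]])           # output layer (1 neuron x 2 hidden)
b_out = np.array([0.05])

x = np.array([1.0, 2.0, 3.0])             # input layer: receives the raw data
h = relu(W_hidden @ x + b_hidden)         # hidden layer: processes the data
y = W_out @ h + b_out                     # output layer: final prediction
print(h, y)                               # h = [2.4, 0.7], y = [1.75]
```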
Question 6 (Hard)
What is backpropagation used for?
A) Forward pass computation
B) Calculating gradients for weight updates ✓
C) Initializing weights
D) Preventing overfitting
Explanation
Answer: B) Calculating gradients for weight updates
Backpropagation is an algorithm that calculates the gradient of the loss function with respect to each weight using the chain rule of calculus. These gradients are then used to update the weights via gradient descent.
Reference: Phase 5 - Backpropagation
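The chain-rule idea can be seen on a single sigmoid neuron with squared-error loss. The values below are arbitrary; the hand-derived gradient is checked against a numerical finite-difference estimate:

```python
import numpy as np

# One sigmoid neuron, squared-error loss: backprop is the chain rule by hand.
x, w, b, target = 2.0, 0.5, 0.1, 1.0   # arbitrary illustrative values

z = w * x + b                  # forward: pre-activation
a = 1 / (1 + np.exp(-z))       # forward: sigmoid activation
loss = (a - target) ** 2

# backward pass (chain rule): dL/dw = dL/da * da/dz * dz/dw
dloss_da = 2 * (a - target)
da_dz = a * (1 - a)            # derivative of the sigmoid
dz_dw = x
grad_w = dloss_da * da_dz * dz_dw

# sanity check against a numerical gradient
eps = 1e-6
def f(w_):
    a_ = 1 / (1 + np.exp(-(w_ * x + b)))
    return (a_ - target) ** 2
numeric = (f(w + eps) - f(w - eps)) / (2 * eps)
print(np.isclose(grad_w, numeric))
```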
Question 7 (Medium)
What does the "learning rate" control?
A) How fast the network learns
B) The size of weight updates ✓
C) The number of epochs
D) The batch size
Explanation
Answer: B) The size of weight updates
The learning rate (α) determines how much the weights are adjusted during each training step. Too large: training becomes unstable. Too small: training is very slow.
Formula: new_weight = old_weight - learning_rate * gradient
Reference: Phase 5 - Training Hyperparameters
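A quick sketch of the too-large/too-small behavior, using gradient descent on the toy loss f(w) = w² (gradient 2w) rather than a real network:

```python
# Gradient descent on f(w) = w**2, whose gradient is 2w.
def descend(lr, steps=20, w=1.0):
    for _ in range(steps):
        w = w - lr * (2 * w)   # new_weight = old_weight - learning_rate * gradient
    return w

print(abs(descend(0.1)))    # ~0.0115 -- converges toward the minimum at 0
print(abs(descend(0.001)))  # ~0.96   -- too small: barely moved in 20 steps
print(abs(descend(1.1)))    # ~38.3   -- too large: overshoots and diverges
```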
Question 8 (Hard)
import numpy as np
weights = np.array([0.5, -0.3, 0.8])
inputs = np.array([1, 2, 3])
bias = 0.1
output = np.dot(weights, inputs) + bias
What is the value of output?
A) 1.4
B) 2.0
C) 2.4 ✓
D) 3.0
Explanation
Answer: C) 2.4
Calculation:
output = (0.5 × 1) + (-0.3 × 2) + (0.8 × 3) + 0.1
= 0.5 + (-0.6) + 2.4 + 0.1
= 2.4
This is a simple neuron computation: weighted sum of inputs plus bias.
Reference: Phase 5 - Forward Pass
Question 9 (Medium)
What is the main difference between a shallow and a deep neural network?
A) Input size
B) Number of hidden layers ✓
C) Activation function used
D) Learning rate
Explanation
Answer: B) Number of hidden layers
Shallow network: 1-2 hidden layers
Deep network: 3+ hidden layers (hence "deep learning")
Deep networks can learn more complex, hierarchical representations of data.
Reference: Phase 5 - Deep Learning Basics
Question 10 (Hard)
Which statement about the vanishing gradient problem is TRUE?
A) It only affects the output layer
B) It makes gradients in early layers very small ✓
C) It speeds up training
D) It only occurs with ReLU activation
Explanation
Answer: B) It makes gradients in early layers very small
In deep networks with certain activation functions (e.g., sigmoid, tanh), gradients can become extremely small as they propagate backward through layers. This makes it difficult to update weights in early layers.
Solution: Use ReLU activation or batch normalization
Reference: Phase 5 - Training Challenges
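The shrinking effect is easy to see numerically: the sigmoid's derivative is at most 0.25, and backpropagation multiplies one such factor per layer, so the product vanishes as depth grows. A minimal sketch (best case, evaluating the derivative at its maximum):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Sigmoid derivative sigma'(z) = sigma(z)(1 - sigma(z)) peaks at z = 0 with value 0.25.
z = 0.0
d = sigmoid(z) * (1 - sigmoid(z))          # 0.25

# Backprop multiplies one derivative factor per layer (chain rule),
# so even this best-case factor shrinks gradients exponentially with depth:
for n_layers in (2, 10, 30):
    print(n_layers, d ** n_layers)         # 0.0625, ~9.5e-07, ~8.7e-19

# ReLU's derivative is 1 for positive inputs, so the product does not shrink:
relu_d = 1.0
print(relu_d ** 30)                        # 1.0
```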
Scoring Guide
0-3 correct (0-30%): Don't worry! Neural networks are advanced. Start with the basics and take your time.
4-5 correct (40-50%): You have some foundation. Review mathematical concepts (linear algebra, calculus) before diving into neural networks.
6-7 correct (60-70%): Good baseline! You're ready to learn neural networks with some effort.
8-9 correct (80-90%): Excellent! You have strong fundamentals. This phase will solidify your knowledge.
10 correct (100%): Outstanding! You may already know this content, but you'll still learn implementation details and best practices.
Next Steps
After taking this pre-quiz:
Score < 70%: Review these prerequisites first:
Linear algebra (matrix multiplication)
Basic calculus (derivatives)
Python programming
Score ≥ 70%: Proceed with Phase 5 content
Retake after Phase 5: Take the post-quiz to measure your progress
Remember: A low score is perfectly normal! This quiz shows where you're starting, not where you'll end up.