Neural Network Learning
CSC 261 - Artificial Intelligence - Weinman
Answer the following questions. Record your answers in your Reading
Journal.
- Explain how the last equation on p. 726, the loss gradient for learning
in logistic regression, relates to the penultimate form in the derivation
of the learning rule for output nodes in neural networks on p. 735,
which is ∂Loss/∂w_{j,k} = −2 (y_k − a_k) g′(in_k) a_j.
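To see the connection numerically, here is a small sketch (assuming the sigmoid activation and squared-error loss used in the text; the input, weight, and target values are made up for illustration). It checks that the p. 726 gradient, written with h_w(x)(1 − h_w(x)), equals the p. 735 form written with the modified error Δ_k = Err_k × g′(in_k), since g′(in) = g(in)(1 − g(in)) for the sigmoid:

```python
import math

def g(x):
    # Logistic (sigmoid) activation
    return 1.0 / (1.0 + math.exp(-x))

# One training example (assumed toy values): inputs x, weights w, target y
x = [1.0, 0.5, -1.5]   # x[0] is the bias input
w = [0.2, -0.4, 0.1]
y = 1.0

in_total = sum(wi * xi for wi, xi in zip(w, x))
h = g(in_total)        # prediction h_w(x)

# Loss gradient in the p. 726 form (logistic regression):
# dLoss/dw_i = -2 (y - h) * h (1 - h) * x_i
grad_lr = [-2.0 * (y - h) * h * (1.0 - h) * xi for xi in x]

# The same gradient in the network notation of p. 735, where the
# modified error is Delta_k = Err_k * g'(in_k) and the input
# activation a_j plays the role of x_i:
delta_k = (y - h) * h * (1.0 - h)
grad_net = [-2.0 * delta_k * xi for xi in x]

# The two forms agree term by term
assert all(abs(a - b) < 1e-12 for a, b in zip(grad_lr, grad_net))
```

The point of the sketch is that a single sigmoid output unit *is* a logistic regression, so the two gradients differ only in notation.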
- Consider that the modified error Δ_k is defined as the error
at output unit k times the rate of change of the activation at
that unit. In your own words, briefly explain the analogous meaning
of the modified error Δ_j at hidden unit j, as defined
in Equation (18.12).
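As a concrete aid for thinking about this question, here is a sketch of Equation (18.12), Δ_j = g′(in_j) Σ_k w_{j,k} Δ_k, for one hidden unit feeding two output units (all numeric values are assumed for illustration):

```python
import math

def g(x):
    # Logistic (sigmoid) activation; its derivative is g(x)(1 - g(x))
    return 1.0 / (1.0 + math.exp(-x))

# Assumed toy values: hidden unit j connects to two output units k
in_j = 0.3              # weighted input arriving at hidden unit j
w_jk = [0.5, -0.8]      # weights from j to each output unit k
delta_k = [0.1, -0.05]  # modified errors Delta_k at the output units

a_j = g(in_j)
g_prime_j = a_j * (1.0 - a_j)   # g'(in_j) for the sigmoid

# Equation (18.12): each output's error is propagated back to j,
# weighted by the strength of j's connection to that output
delta_j = g_prime_j * sum(w * d for w, d in zip(w_jk, delta_k))
```

Notice that the sum distributes each output unit's error back to j in proportion to w_{j,k}, which is the idea your answer should put into words.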
- Explain an example from your own experience of something akin to overfitting.