Correct Answer : Both LMS error & Gradient descent learning law
Explanation : The weight update rule minimizes the mean squared error (the squared delta), averaged over all inputs, and this law is derived by following the negative gradient of the error surface in weight space.
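As a minimal sketch of the LMS (delta) rule described above: the weights move along the negative gradient of the squared error for each input. The learning rate `eta`, the epoch count, and the toy data below are illustrative assumptions, not part of the original question.

```python
# LMS / delta rule sketch: for squared error E = (d - y)^2 / 2 with a
# linear unit y = w . x, the negative gradient gives dw_i = eta * e * x_i.

def lms_train(samples, targets, eta=0.1, epochs=50):
    w = [0.0] * len(samples[0])                     # zero-initialized weights
    for _ in range(epochs):
        for x, d in zip(samples, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))  # linear output
            e = d - y                                 # error term (delta)
            # step along the negative error gradient in weight space
            w = [wi + eta * e * xi for wi, xi in zip(w, x)]
    return w

# Toy data generated by the target weights [2, -1] (illustrative)
X = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
D = [2.0, -1.0, 1.0, 3.0]
w = lms_train(X, D)    # converges near [2, -1]
```

Because the toy data is exactly realizable by a linear unit, repeated negative-gradient steps drive the error toward zero.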
Correct Answer : Small clusters
Explanation : The number of input samples associated with the same neuron is reduced, so the network forms many small clusters.
Correct Answer : binary
Explanation : Adaptive Resonance Theory (ART1) operates on binary input vectors and takes care of the stability-plasticity dilemma.
Correct Answer : None of the above
Explanation : The vigilance parameter in ART determines the tolerance of the matching process.
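The vigilance and cluster-size points above can be sketched together. This is a simplified illustration of the ART1 match test only (it omits the bottom-up choice function and the full reset search); the data and vigilance values are assumptions for demonstration.

```python
# Simplified ART1-style clustering: a binary input x joins category j only
# if the match |prototype AND x| / |x| meets the vigilance rho; otherwise a
# new category is created. Higher rho -> stricter matching -> fewer inputs
# per neuron -> many small clusters.

def match_score(proto, x):
    overlap = sum(p & b for p, b in zip(proto, x))
    return overlap / sum(x)

def art1_cluster(inputs, rho):
    prototypes, labels = [], []
    for x in inputs:
        for j, p in enumerate(prototypes):
            if match_score(p, x) >= rho:
                # resonance: fast-learning update, prototype AND input
                prototypes[j] = [pi & xi for pi, xi in zip(p, x)]
                labels.append(j)
                break
        else:
            prototypes.append(list(x))        # commit a new category
            labels.append(len(prototypes) - 1)
    return prototypes, labels

data = [(1, 1, 0, 0), (1, 1, 1, 0), (0, 0, 1, 1), (0, 1, 1, 1)]
coarse, _ = art1_cluster(data, rho=0.5)   # low vigilance: 2 clusters
fine, _   = art1_cluster(data, rho=0.9)   # high vigilance: 4 clusters
```

Raising `rho` from 0.5 to 0.9 doubles the number of categories on this toy data, which is the "small clusters" effect noted above.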