A Comparison between Squared Error and Relative Entropy Metrics Using Several Optimization Algorithms
Raymond L. Watrous
Siemens Corporate Research
755 College Road East, Princeton, NJ 08540
Abstract
Convergence rates and generalization performance are compared for the squared error metric and a relative entropy metric on a contiguity problem using several optimization algorithms. The relative entropy metric converged to a good solution slightly more often than the squared error metric given the same distribution of initial weights. However, where the results differed, the squared error metric converged on average more rapidly to solutions that generalized better to the test data. These results are not in complete agreement with some previously published results.
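For reference, a common formulation of the two error measures compared here, written for target values $d_k$ and network outputs $y_k$ summed over output units and training patterns (the notation is illustrative and need not match the paper's exact definitions), is

\[
E_{\mathrm{SE}} \;=\; \tfrac{1}{2}\sum_{k}\bigl(y_k - d_k\bigr)^2,
\qquad
E_{\mathrm{RE}} \;=\; \sum_{k}\Bigl[\,d_k \ln\frac{d_k}{y_k} \;+\; (1 - d_k)\ln\frac{1 - d_k}{1 - y_k}\Bigr].
\]

The relative entropy form assumes outputs in $(0,1)$, as produced by sigmoid units, and reduces to the familiar cross-entropy loss for binary targets.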