Understanding the Loss Surface of Neural Networks for Binary Classification
Speaker(s): Ruoyu Sun (University of Illinois at Urbana-Champaign)
Time: 10:00-11:00, August 21, 2018
Venue: Room 9, Quan Zhai, BICMR
One of the major challenges of training neural networks is the non-convexity of the loss function, which can lead to many local minima. Due to the recent success of deep learning, it is widely conjectured that the local minima of neural networks lead to similar training performance and are therefore not a major issue. In this talk, we discuss the loss surface of neural networks for binary classification. We provide a collection of necessary and sufficient conditions under which the neural network problem has no bad local minima. On the positive side, we prove that no bad local minima exist under a few conditions on the neuron types, the network structure (e.g., a skip-like connection), the loss function, and the dataset. While these conditions may seem numerous, on the negative side we provide dozens of counterexamples showing that bad local minima can exist when they do not hold. For example, ReLU neurons can create bad local minima, while increasing and strictly convex neurons (e.g., smooth versions of ReLU) can eliminate them.
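The ReLU counterexample in the last sentence is easy to see in code. Below is a minimal NumPy sketch, not from the talk: the toy dataset, the one-neuron model f(x) = v*act(w*x + b) + c, and all parameter values are illustrative assumptions. A ReLU neuron initialized in its "dead" region sits at a flat bad point of the logistic loss, while softplus, a smooth, strictly increasing, strictly convex surrogate of ReLU, still receives gradient and fits the data.

import numpy as np

# Toy 1-D binary classification: x > 0 is labeled +1, x < 0 is labeled -1.
X = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])

def relu(z):
    return np.maximum(z, 0.0)

def softplus(z):
    # Smooth, strictly increasing, strictly convex surrogate of ReLU.
    return np.log1p(np.exp(z))

def loss(act, p):
    # One-neuron network f(x) = v * act(w*x + b) + c with logistic loss
    # on +/-1 labels: mean(log(1 + exp(-y * f))).
    w, b, v, c = p
    f = v * act(w * X + b) + c
    return np.mean(np.log1p(np.exp(-y * f)))

def train(act, p0, lr=0.3, steps=3000, eps=1e-6):
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        g = np.zeros_like(p)
        for i in range(len(p)):  # central-difference gradient, one parameter at a time
            d = np.zeros_like(p)
            d[i] = eps
            g[i] = (loss(act, p + d) - loss(act, p - d)) / (2 * eps)
        p -= lr * g
    return p, loss(act, p)

# Start in a "dead" region: w*x + b = -1 < 0 on every sample, so the ReLU
# neuron outputs 0 everywhere and every gradient component vanishes
# (the output-bias gradient is also exactly zero because labels are balanced).
p0 = [0.0, -1.0, 1.0, 0.0]
print("ReLU:    ", train(relu, p0))      # stuck: loss stays at log(2) ~ 0.693
print("softplus:", train(softplus, p0))  # escapes: loss drops well below log(2)

Numerical central-difference gradients are used deliberately, so the exactly-zero gradient of the dead ReLU neuron is visible without any autodiff machinery.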