Document Type

Conference Proceeding

Publication Date

2018

Abstract

We study the loss surface of neural networks equipped with a hinge loss criterion and ReLU or leaky ReLU nonlinearities. Any such network defines a piecewise multilinear form in parameter space. By appealing to harmonic analysis, we show that all local minima of such networks are non-differentiable, except for those minima that occur in a region of parameter space where the loss surface is perfectly flat. Non-differentiable minima are therefore not technicalities or pathologies; they lie at the heart of the problem when investigating the loss surface of ReLU networks. As a consequence, we must employ techniques from nonsmooth analysis to study these loss surfaces. We show how to apply these techniques in some illustrative cases.
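
To make the abstract's central claim concrete, the following is a minimal worked sketch in notation of our own choosing (the symbols f, W, v, h, and alpha are illustrative and not taken from the paper). Consider a one-hidden-layer network with hidden weights W = (w_1, ..., w_h), output weights v, and a leaky ReLU with slope alpha in [0, 1), where alpha = 0 recovers the standard ReLU:

\[
  f(x; W, v) \;=\; \sum_{i=1}^{h} v_i \,\sigma(w_i^\top x),
  \qquad \sigma(t) = \max(t, \alpha t).
\]

On any open region of parameter space where the activation pattern is constant, i.e. the sign of w_i^\top x is fixed for every data point x and every hidden unit i, the nonlinearity reduces to multiplication by a fixed slope s_i(x) in {alpha, 1}, so

\[
  f(x; W, v) \;=\; \sum_{i=1}^{h} v_i \, s_i(x) \, (w_i^\top x),
\]

which is linear in v for fixed W and linear in each w_i when all other parameters are held fixed: a multilinear form. Composing with the hinge loss \ell(z) = \max(0, 1 - yz) and summing over the data preserves this structure piecewise. The sketch also suggests why flat regions arise: along any line that varies a single parameter within such a region, the loss is affine, and an affine function attains an interior minimum only where it is locally constant. The paper's full argument, via harmonic analysis, strengthens this observation to the statement quoted above.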

Recommended Citation

Laurent, T. & von Brecht, J. (2018). The Multilinear Structure of ReLU Networks. Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2908–2916.

Supplementary Material

laurent18b-supp.pdf (302 kB)
