New preprint about Constrained Empirical Risk Minimization
Our latest research offers a new approach to enforcing constraints on deep neural networks. We recast constrained training as an ordinary (unconstrained) risk minimization problem on a Riemannian manifold, then use Riemannian Stochastic Gradient Descent (RSGD) to optimize the network, so the constraints are satisfied exactly at every step of training. We demonstrate a practical application of the method to medical contour prediction with Convolutional Neural Networks, where we constrain the filters to be wavelets.
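The core recipe behind Riemannian SGD can be sketched in a few lines. The example below uses the unit sphere as a stand-in manifold (the wavelet constraint in the paper defines a different, more involved manifold, and this is not the paper's code); the pattern is the same: project the Euclidean gradient onto the tangent space at the current point, take a step, then retract back onto the manifold, so the constraint holds exactly after every update.

```python
import numpy as np

def riemannian_sgd_step(w, euclid_grad, lr=0.1):
    """One RSGD step for a point w on the unit sphere ||w|| = 1."""
    # Project the Euclidean gradient onto the tangent space at w
    # (remove the component normal to the sphere).
    riem_grad = euclid_grad - np.dot(euclid_grad, w) * w
    # Take a gradient step in the tangent direction.
    w_new = w - lr * riem_grad
    # Retraction: renormalize back onto the sphere.
    return w_new / np.linalg.norm(w_new)

# Toy problem: minimize f(w) = w . t over the sphere; the minimizer is -t.
rng = np.random.default_rng(0)
t = np.array([0.0, 0.0, 1.0])
w = rng.normal(size=3)
w /= np.linalg.norm(w)          # start on the manifold
for _ in range(200):
    w = riemannian_sgd_step(w, t)  # gradient of w . t w.r.t. w is t
print(w)  # converges to approximately [0, 0, -1], with ||w|| = 1 at every step
```

The key point, as in the paper, is that the constraint is never merely penalized: the iterate lies on the manifold by construction throughout training.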