Hybrid Random-Local Search for High-Dimensional Optimization Without Convexity Assumption
- AMSE, Université Aix-Marseille, France [pierre.BERTRAND.1@univ-amu.fr]
- LPSM, Sorbonne Université, Paris, France [michel.broniatowski@sorbonne-universite.fr]
- Maths Department, FAU Erlangen, Germany [stummer@math.fau.de]
Keywords: Optimization – Random local search – High dimension – Non-convexity – Lasso – Neural network pruning
Abstract
We propose a method to minimize a continuous function over a subset of a high-dimensional space without assuming convexity. The method alternates between a Random Search (RS) regime and a Local Search (LS) regime to improve an eligible candidate, the starting point of the algorithm.
Beyond the alternation between the two regimes, the originality of the method lies in its exploitation of high dimensionality. We establish concentration properties under the RS regime; in parallel, we show that the algorithm reaches any point of the search space in finite time.
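The alternation described above can be sketched as follows. This is an illustrative reconstruction only: the sampling distributions, acceptance rules, and the step-contraction factor are assumptions, since the abstract does not specify them.

```python
import numpy as np

def hybrid_rs_ls(f, x0, n_outer=20, n_rs=200, n_ls=200, step=0.1, rng=None):
    """Hybrid Random Search / Local Search (illustrative sketch).

    Alternates an RS sweep (isotropic random directions) with an LS sweep
    (coordinate-wise refinement), keeping the best point seen so far.
    The sampling and acceptance rules here are illustrative guesses,
    not the paper's exact specification.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float).copy()
    fx = f(x)
    d = x.size
    for _ in range(n_outer):
        # RS regime: step along random unit directions; in high dimension
        # the norms of Gaussian vectors concentrate around sqrt(d).
        for _ in range(n_rs):
            u = rng.standard_normal(d)
            cand = x + step * u / np.linalg.norm(u)
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc
        # LS regime: try +/- step on a random coordinate, accept improvements.
        for _ in range(n_ls):
            i = rng.integers(d)
            for s in (step, -step):
                cand = x.copy()
                cand[i] += s
                fc = f(cand)
                if fc < fx:
                    x, fx = cand, fc
        step *= 0.7  # contract the scale of both regimes
    return x, fx
```

No gradient or convexity is required: both regimes only compare function values, which is what makes the scheme applicable to non-convex objectives.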
Finally, two applications are proposed. The first is the reduction of the ℓ1 norm in the Lasso, which demonstrates the relevance of the method. In a second, more practical application, we accelerate neural network inference by pruning weights while maintaining performance. Our approach achieves substantial weight reduction with minimal loss of accuracy, offering an effective solution for network optimization.
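For context, the pruning application can be contrasted with the standard magnitude baseline sketched below, which zeroes a fixed fraction of the smallest-magnitude weights of a layer. This baseline is a hypothetical illustration, not the paper's algorithm; the paper's contribution is to search the pruning configuration with the RS/LS alternation instead.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.8):
    """Zero out the fraction `sparsity` of smallest-magnitude weights.

    Plain magnitude pruning, shown only as a baseline for comparison
    with search-based pruning schemes.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return np.array(weights, dtype=float, copy=True)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold
    return weights * mask
```

A search-based method can, in principle, find masks that a greedy magnitude threshold misses, at the cost of evaluating the network's loss for each candidate mask.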
References
- Broniatowski and Stummer [2023] M. Broniatowski and W. Stummer. A precise bare simulation approach to the minimization of some distances. I. Foundations. IEEE Transactions on Information Theory, 69(5):3062–3120, 2023.
- Guédon and Milman [2011] O. Guédon and E. Milman. Interpolating thin-shell and sharp large-deviation estimates for isotropic log-concave measures. Geometric and Functional Analysis, 21(5):1043–1068, 2011.
- Marino et al. [2023] G.C. Marino, A. Petrini, D. Malchiodi, and M. Frasca. Deep neural networks compression: A comparative survey and choice recommendations. Neurocomputing, 520:152–170, 2023.
- Wegner [2024] S.A. Wegner. Lecture notes on high-dimensional data, 2024.