Many optimization problems arising in the study of random structures exhibit an apparent gap between the optimal values, which can be estimated by non-constructive means, and the best values achievable by fast (polynomial-time) algorithms. Through a combined effort of mathematicians, computer scientists, and statistical physicists, it has become apparent that a potential, and in some cases provable, obstruction to designing algorithms that bridge this gap is a phase transition in the geometry of nearly optimal solutions, in particular the presence of a certain Overlap Gap Property (OGP).
In this talk we discuss this property in the context of the sparse high-dimensional linear regression problem with Gaussian design. We show that, on the one hand, in the sampling regime where the known fast methods for this problem are effective, the space of solutions exhibits a monotonicity with respect to the proximity to the ground truth regression vector, and no local optima exist apart from the ground truth. On the other hand, once the sample size is asymptotically in the regime where the known methods fail, we show that this monotonicity is lost and the model exhibits an OGP. In the context of the regression problem this means that every solution with a small mean squared error is either fairly close to the ground truth or very far from it, with no middle ground; a sketch of this statement is given below.
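For concreteness, here is a minimal sketch of the setup and of the OGP statement in this setting; the notation ($n$, $p$, $k$, $r_1$, $r_2$) is chosen here for illustration and is not taken verbatim from the talk.

\[
y = X\beta^* + W, \qquad X \in \mathbb{R}^{n \times p} \ \text{with i.i.d.\ } \mathcal{N}(0,1) \ \text{entries}, \qquad \beta^* \ \text{a $k$-sparse vector}, \qquad W \ \text{noise}.
\]

In this notation, the OGP asserts that in the relevant regime of the sample size $n$ there exist radii $r_1 < r_2$ such that every $k$-sparse $\beta$ with small mean squared error $\tfrac{1}{n}\|y - X\beta\|_2^2$ satisfies either $\|\beta - \beta^*\|_2 \le r_1$ or $\|\beta - \beta^*\|_2 \ge r_2$, with no solution in the intermediate band $(r_1, r_2)$.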