Entropic regularization of the $\ell_0$ function
Many problems of interest admit more than one solution and seek, among
these, the one that is sparsest. The objective that most directly accounts
for sparsity, the $\ell_0$ metric, is usually avoided since it leads to a
combinatorial optimization problem. The function $\|x\|_0$ is often viewed
as the limit of the $\ell_p$ metrics as $p\searrow 0$. Naturally, there have
been attempts to use this as an objective for small $p$, though it is a
nonconvex function for $p<1$.
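For reference, the limit in question is the pointwise limit of the $p$-th
powers of the $\ell_p$ metrics:
\[
\|x\|_0 \;=\; \#\{j : x_j \neq 0\} \;=\; \lim_{p\searrow 0} \sum_{j} |x_j|^p .
\]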
We propose instead a scaled and shifted Fermi--Dirac entropy with two parameters,
one controlling the smoothness of the approximation and the other the
{\em steepness} of the metric.
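For orientation, the building block is the classical Fermi--Dirac entropy
\[
h(u) \;=\; u\ln u + (1-u)\ln(1-u), \qquad u\in[0,1];
\]
the two-parameter family studied here arises by scaling and shifting this
function, one parameter governing smoothness and the other steepness. (The
precise parameterization is made explicit in the body of the paper.)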
Our proposed metric is a convex relaxation
for which a strong duality theory holds, yielding dual
methods for metrics approaching the desired $\|\cdot\|_0$ function.
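As a schematic illustration of the duality at work (not necessarily the
exact formulation used in the paper), a convex relaxation $\varphi$ of
$\|\cdot\|_0$ applied to a linearly constrained problem admits, under strong
duality, the Fenchel dual pair
\[
\inf_{x}\,\{\varphi(x) : Ax=b\}
\;=\;
\sup_{y}\;\langle b, y\rangle - \varphi^{*}(A^{T} y),
\]
where $\varphi^{*}$ is the convex conjugate of $\varphi$; strong duality for
the smoothed relaxation is what makes such dual methods viable.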
Without smoothing, we propose a dynamically reweighted
subdifferential descent method with ``exact'' line search that is finitely
terminating when the constraints are well-separated. In a special case, this
algorithm is shown to recapture certain well-known ``greedy'' algorithms.
Consequently we are able to provide an explicit algorithm
whose fixed point, under the appropriate assumptions,
is the sparsest possible solution. The variational perspective
yields general strategies to make the algorithm more robust.
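As a hedged illustration of the kind of greedy scheme alluded to above, the
following is a minimal sketch of orthogonal matching pursuit, a canonical
member of that family (our own illustration, not the paper's reweighted
subdifferential descent method; the function name, tolerance, and stopping
rule are illustrative choices):
\begin{verbatim}
import numpy as np

def orthogonal_matching_pursuit(A, b, max_nonzeros, tol=1e-10):
    """Greedy sparse solve of A x = b: select the column most
    correlated with the residual, then re-fit by least squares
    on the selected support (illustrative sketch)."""
    m, n = A.shape
    x = np.zeros(n)
    support = []          # indices of columns selected so far
    residual = b.copy()
    for _ in range(max_nonzeros):
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares re-fit restricted to the support.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = b - A @ x
        if np.linalg.norm(residual) <= tol:
            break
    return x

# Usage: recover a 2-sparse vector from 6 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
x_hat = orthogonal_matching_pursuit(A, A @ x_true, max_nonzeros=2)
print(np.flatnonzero(np.abs(x_hat) > 1e-8))  # typically [2 7]
\end{verbatim}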