Article records
https://feeds.library.caltech.edu/people/Oymak-Samet/article.rss
A Caltech Library Repository Feed
Sat, 13 Apr 2024 00:01:55 +0000

Sharp MSE Bounds for Proximal Denoising
https://resolver.caltech.edu/CaltechAUTHORS:20160825-102038882
Authors: Samet Oymak (Oymak-Samet); Babak Hassibi (Hassibi-B, ORCID: 0000-0002-1375-5838)
Year: 2016
DOI: 10.1007/s10208-015-9278-4
Denoising is the problem of estimating a signal x_0 from its noisy observations y = x_0 + z. In this paper, we focus on the "structured denoising problem," where the signal x_0 possesses a certain structure and z has independent normally distributed entries with mean zero and variance σ^2. We employ a structure-inducing convex function f(⋅) and solve min_x {1/2 ∥y − x∥^2_2 + σλ f(x)} to estimate x_0, for some λ > 0. Common choices for f(⋅) include the ℓ_1 norm for sparse vectors, the ℓ_1−ℓ_2 norm for block-sparse signals, and the nuclear norm for low-rank matrices. The metric we use to evaluate the performance of an estimate x^∗ is the normalized mean-squared error NMSE(σ) = E∥x^∗ − x_0∥^2_2 / σ^2. We show that the NMSE is maximized as σ → 0, and we find the exact worst-case NMSE, which has a simple geometric interpretation: the mean-squared distance of a standard normal vector to the λ-scaled subdifferential λ∂f(x_0). When λ is optimally tuned to minimize the worst-case NMSE, our results can be related to the constrained denoising problem min_{f(x) ≤ f(x_0)} {∥y − x∥_2}. The paper also connects these results to the generalized LASSO problem, in which one solves min_{f(x) ≤ f(x_0)} {∥y − Ax∥_2} to estimate x_0 from noisy linear observations y = Ax_0 + z. We show that certain properties of the LASSO problem are closely related to the denoising problem. In particular, we characterize the normalized LASSO cost and show that it exhibits a "phase transition" as a function of the number of observations. We also provide an order-optimal bound for the LASSO error in terms of the mean-squared distance. Our results are significant in two ways. First, we find a simple formula for the performance of a general convex estimator. Second, we establish a connection between the denoising and linear inverse problems.

https://authors.library.caltech.edu/records/fbe5r-xn288
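As an illustrative sketch of the estimator described in the abstract — not code from the paper — one can take f to be the ℓ_1 norm, for which the solution of min_x {1/2 ∥y − x∥^2_2 + σλ ∥x∥_1} is elementwise soft-thresholding at level σλ, and estimate NMSE(σ) by Monte Carlo. All parameter values (n, sparsity k, σ, λ, trial count) below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def prox_l1_denoise(y, threshold):
    """Solve min_x 1/2 ||y - x||_2^2 + threshold * ||x||_1,
    i.e. elementwise soft-thresholding of y at the given level."""
    return np.sign(y) * np.maximum(np.abs(y) - threshold, 0.0)

# Sparse ground truth x_0 and observations y = x_0 + z, z ~ N(0, sigma^2 I).
n, k, sigma, lam = 1000, 50, 0.1, 2.0   # illustrative values, not from the paper
x0 = np.zeros(n)
x0[:k] = rng.standard_normal(k)

# Monte Carlo estimate of NMSE(sigma) = E ||x* - x_0||_2^2 / sigma^2.
trials = 200
errs = []
for _ in range(trials):
    z = sigma * rng.standard_normal(n)
    x_star = prox_l1_denoise(x0 + z, sigma * lam)  # threshold sigma*lambda, as in the objective
    errs.append(np.sum((x_star - x0) ** 2))
nmse = np.mean(errs) / sigma**2
print(f"empirical NMSE for n = {n}, k = {k}: {nmse:.1f}")
```

For a well-chosen λ the empirical NMSE stays far below the ambient dimension n, reflecting the paper's point that the worst-case NMSE is governed by the mean-squared distance of a standard normal vector to λ∂f(x_0) rather than by n.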