EE PhD Colloquium on Imaging Inverse Problems
August 23 @ 11:00 AM - 1:00 PM IST
Title: Improved Derivative-based Regularizations for Imaging Inverse Problems
Student: Manu Ghulyani
Advisor: Prof. Muthuvel Arigovindan
Date and Time: 23.08.2023 (Wednesday), 11 AM
Venue: MMCR, Department of Electrical Engineering
Images undergo degradation during capture due to physical limitations inherent to the imaging devices. Addressing this degradation and recovering high-quality images constitutes the image recovery problem, a crucial concern with applications across fields such as biology, astronomy, and medicine, where improvements in effective image resolution have a significant impact. Examples of this problem include computed tomography reconstruction, magnetic resonance imaging, image deconvolution, and microscopic image reconstruction.
Image recovery is frequently approached via regularization, with derivative-based regularizations being popular because they exploit image smoothness and yield interpretable results free of introduced artifacts. Total Variation (TV) regularization, proposed by Rudin, Osher, and Fatemi, is a seminal approach: it penalizes the norm of the image gradient, aggregated over all pixel locations. Because TV encourages small derivative norms, it leads to piecewise-constant solutions, producing the well-known "staircase effect." To mitigate this effect, Hessian Schatten norm (HSN) regularization employs second-order derivatives: the Schatten p-norm of the eigenvalues of the image Hessian, summed across all pixels. HSN preserves structure better than TV, but its solutions tend to be overly smooth. To address this, we introduce a non-convex shrinkage penalty applied to the Hessian's eigenvalues, in place of the convex lp norm. Although the analytical form of this penalty is unknown, we derive the algorithm through proximal operations. We establish that the proposed regularization satisfies restricted proximal regularity, which ensures the algorithm's convergence. Images recovered with this regularization are sharper than those from the convex counterparts.
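For reference, the standard forms of these two regularizers from the literature can be written as follows, with u the image, (∇u)_k its gradient at pixel k, and Hu_k its 2×2 Hessian there (generic notation, not necessarily the talk's own):

```latex
% Total Variation (Rudin--Osher--Fatemi): norm of the gradient summed over pixels
\mathrm{TV}(u) = \sum_{k} \left\| (\nabla u)_k \right\|_2

% Hessian Schatten norm: Schatten p-norm of the eigenvalues
% \lambda_1, \lambda_2 of the Hessian at each pixel, summed over pixels
\mathrm{HS}_p(u) = \sum_{k} \Bigl( \bigl|\lambda_1(\mathbf{H}u_k)\bigr|^p + \bigl|\lambda_2(\mathbf{H}u_k)\bigr|^p \Bigr)^{1/p}
```

The non-convex penalty discussed in the talk replaces the convex p-th power terms above with a shrinkage-inducing non-convex function of the same eigenvalues.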
In subsequent work, we extend the concept of the Hessian Schatten norm: by combining Schatten norms of the Hessian with a smoothness constraint, we broaden its scope. The resulting regularization can be derived as a Lagrange dual of the Hessian Schatten norm, akin to total generalized variation. Furthermore, we present an efficient variable-splitting scheme for solving image restoration problems.
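As a minimal sketch of the variable-splitting idea (illustrative only, not the scheme proposed in the talk), the ADMM below solves 1-D anisotropic TV denoising by splitting z = Dx and alternating a quadratic x-update, a proximal z-update, and a dual ascent step. All names and parameter values (`tv_denoise_admm`, `lam`, `rho`) are assumptions for the example:

```python
import numpy as np

def soft_threshold(v, t):
    # element-wise soft-thresholding: proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_denoise_admm(y, lam=0.5, rho=1.0, n_iter=200):
    """ADMM for 1-D anisotropic TV denoising:
    minimize 0.5*||x - y||^2 + lam*||D x||_1, with the splitting z = D x."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # forward-difference operator, (n-1) x n
    A = np.eye(n) + rho * (D.T @ D)       # system matrix for the x-update
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)
    x = y.copy()
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))  # quadratic x-update
        z = soft_threshold(D @ x + u, lam / rho)          # prox of the l1 term
        u = u + D @ x - z                                 # dual (scaled) update
    return x

# demo: denoise a noisy piecewise-constant signal
rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(25), np.ones(25)])
y = truth + 0.1 * rng.standard_normal(50)
x_hat = tv_denoise_admm(y, lam=0.5)
```

The splitting decouples the data term (a linear solve) from the non-smooth regularizer (a simple shrinkage), which is the same structural trick that makes derivative-based regularizations tractable in image restoration.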
Total Generalized Variation (TGV) is an important generalization of Total Variation. TGV involves multiple orders of derivatives, and higher-order TGV yields improved recovered image quality, as validated through numerical experiments in image denoising. This creates a demand for an algorithm capable of solving TGV regularization of any order. While various methods address TGV regularization, many are confined to second-order TGV, and only a few explore orders greater than three for image recovery with TGV regularization. To our knowledge, no algorithm solves image recovery problems with TGV regularization of order exceeding three under a general forward model; the difficulty lies in the intricate nature of the TGV representation. We overcome this obstacle by presenting two simple matrix-based representations of TGV, the direct and compact forms, and prove that both are equivalent to the original TGV definition. Leveraging the compact representation, we propose a generalized ADMM-based algorithm that solves TGV regularization for any order.
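For context, the widely used second-order TGV of Bredies, Kunisch, and Pock (the case most existing algorithms handle) has the standard form below, where E(v) is the symmetrized derivative of the auxiliary field v and α₀, α₁ are positive weights. This is the literature definition, not the matrix-based representations introduced in the talk:

```latex
\mathrm{TGV}_{\alpha}^{2}(u)
  \;=\; \min_{v}\;
  \alpha_1 \int_{\Omega} \lvert \nabla u - v \rvert \, dx
  \;+\; \alpha_0 \int_{\Omega} \lvert \mathcal{E}(v) \rvert \, dx
```

The nested minimization over auxiliary fields is what makes TGV of order greater than two cumbersome to represent, motivating the direct and compact matrix forms.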
ALL ARE WELCOME.