Technion - Israel Institute of Technology
The Graduate School
Ph.D. Thesis

Ph.D. Student: Oleg Kuybeda
Subject: Anomaly Preserving Redundancy Reduction in High-Dimensional Signals
Department: Department of Electrical Engineering
Supervisors: Professor Emeritus David Malah, Dr. Meir Bar-Zohar


Abstract

In this research we address the problem of anomaly-preserving redundancy reduction of high-dimensional noisy signals. Since anomalies contribute weakly to the ℓ2-norm of the signal as compared to noise, classical ℓ2-based approaches are unsatisfactory for obtaining a good representation of anomalies. Here we develop new signal-subspace estimation techniques that aim to represent well not only the ℓ2-significant signal, but also anomaly vectors, by optimizing a new criterion that is more sensitive to anomalies. First, we propose a greedy anomaly-preserving algorithm for signal-subspace and rank estimation, denoted the Maximum Orthogonal-Complements Algorithm (MOCA). MOCA combines ℓ2 and ℓ∞ norms and considers two aspects: one deals with signal-subspace estimation, aiming to minimize the maximum of the data-residual ℓ2-norms, denoted the ℓ2,∞-norm, for a given rank; the other determines whether the rank conjecture is valid.
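As a toy illustration of why a classical ℓ2 criterion misses anomalies while the ℓ2,∞ (maximum residual) criterion does not, the following NumPy sketch compares the two costs on synthetic data; the data, the rank-1 subspaces, and the directions are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def residual_norms(X, U):
    """l2-norms of the residuals after projecting the columns of X
    onto the subspace spanned by the orthonormal columns of U."""
    R = X - U @ (U.T @ X)
    return np.linalg.norm(R, axis=0)

def l2_cost(X, U):
    # classical criterion: sum of squared residual l2-norms (what SVD/PCA minimizes)
    return np.sum(residual_norms(X, U) ** 2)

def l2_inf_cost(X, U):
    # anomaly-sensitive criterion: the maximum residual l2-norm (the l2,inf-norm)
    return np.max(residual_norms(X, U))

# toy data: many background vectors near a 1-D subspace plus one rare anomaly
rng = np.random.default_rng(0)
background = np.outer([1.0, 0.0, 0.0], rng.normal(size=200)) + 0.01 * rng.normal(size=(3, 200))
anomaly = np.array([[0.0], [0.0], [5.0]])
X = np.hstack([background, anomaly])

# rank-1 SVD subspace: dominated by the l2-significant background direction
U_svd, _, _ = np.linalg.svd(X, full_matrices=False)
U_pca = U_svd[:, :1]

# a rank-1 subspace aligned with the anomaly direction instead
U_anom = np.array([[0.0], [0.0], [1.0]])

# the SVD subspace wins on the l2 criterion but loses on the l2,inf criterion:
print(l2_cost(X, U_pca) < l2_cost(X, U_anom))          # True: l2 prefers the background
print(l2_inf_cost(X, U_anom) < l2_inf_cost(X, U_pca))  # True: l2,inf prefers the anomaly
```

The anomaly carries little total energy, so it barely moves the ℓ2 cost, but it dominates the worst-case residual that the ℓ2,∞ criterion measures.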

In the next part of the research, we adapt MOCA for anomaly detection, discrimination, and population estimation of anomalies in hyperspectral images. The proposed approach is denoted the Anomaly Extraction and Discrimination Algorithm (AXDA). The main idea of AXDA is to iteratively reduce the rank of the anomaly-vector subspace found by MOCA, causing the related anomalies to be poorly represented. This helps to detect them by a statistical analysis of the ℓ2-norm of the data residuals. As a by-product, AXDA also provides a robust estimate of an anomaly-free background subspace and of its rank.
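The residual-analysis step can be illustrated by a much-simplified sketch: project the data onto a (here, known) background subspace and flag samples with statistically large residual ℓ2-norms. The mean-plus-k-standard-deviations threshold and the toy data are assumptions for illustration, not AXDA's actual statistical test:

```python
import numpy as np

def detect_by_residuals(X, U_bg, num_std=4.0):
    """Flag columns of X whose residual l2-norm w.r.t. the background
    subspace U_bg is statistically large (a simplified stand-in for
    AXDA's residual analysis; the threshold rule is an assumption)."""
    R = X - U_bg @ (U_bg.T @ X)               # residuals w.r.t. the background subspace
    norms = np.linalg.norm(R, axis=0)          # per-sample residual l2-norms
    thresh = norms.mean() + num_std * norms.std()
    return np.flatnonzero(norms > thresh)      # indices of suspected anomalies

rng = np.random.default_rng(1)
background = np.outer([1.0, 0.0, 0.0], rng.normal(size=300)) + 0.01 * rng.normal(size=(3, 300))
X = background.copy()
X[:, 42] += np.array([0.0, 0.0, 4.0])          # plant one anomaly off the background subspace

U_bg = np.array([[1.0], [0.0], [0.0]])         # background subspace, known in this toy example
print(detect_by_residuals(X, U_bg))            # → [42]
```

A sample poorly represented by the background subspace stands out sharply in the residual-norm statistics, which is the effect AXDA exploits after shrinking the anomaly subspace.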

Next, we develop an optimal algorithm for the ℓ2,∞-norm minimization, which we call Maximum Orthogonal-complements Optimal Subspace Estimation (MOOSE). The optimization is performed via a natural conjugate-gradient learning approach carried out on the set of k-dimensional subspaces in R^p, which is a Grassmann manifold. We propose to initialize MOOSE with the signal subspace obtained by MOCA.
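A minimal sketch of subspace optimization on the Grassmann manifold, using plain Riemannian subgradient descent with a QR retraction in place of MOOSE's natural conjugate-gradient scheme; the step size, iteration count, and toy data are all assumptions for illustration:

```python
import numpy as np

def l2_inf(X, U):
    """The l2,inf cost: maximum residual l2-norm over all samples."""
    R = X - U @ (U.T @ X)
    return np.linalg.norm(R, axis=0).max()

def grassmann_descent(X, U0, step=0.001, iters=500):
    """Decrease the l2,inf cost over subspaces by Riemannian subgradient
    descent with a QR retraction (a simplified stand-in for MOOSE)."""
    U = U0.copy()
    for _ in range(iters):
        norms = np.linalg.norm(X - U @ (U.T @ X), axis=0)
        x = X[:, np.argmax(norms)]          # worst-represented sample
        G = -2.0 * np.outer(x, x) @ U       # Euclidean gradient of its squared residual
        G = G - U @ (U.T @ G)               # project onto the Grassmann tangent space
        U, _ = np.linalg.qr(U - step * G)   # retract back onto the manifold
    return U

# toy data: background along e1, one anomaly mostly along e3
rng = np.random.default_rng(2)
background = np.zeros((3, 200))
background[0] = rng.normal(size=200)
anomaly = np.array([[1.0], [0.0], [5.0]])
X = np.hstack([background, anomaly])

U0 = np.array([[1.0], [0.0], [0.0]])        # initial (background-only) subspace
U = grassmann_descent(X, U0)                # tilts toward the anomaly, lowering the l2,inf cost
```

Because the max-residual objective is nonsmooth, this sketch follows only the subgradient of the worst sample at each step; the thesis's conjugate-gradient treatment on the manifold is considerably more refined.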

Since contemporary hyperspectral imagers are not well suited to mobile applications, there is a demand for multispectral imagers. Finally, therefore, we propose a novel unsupervised algorithm for designing multispectral filters that are tuned for local anomaly detection algorithms. The problem of designing the multispectral filters is formulated as a problem of channel reduction in hyperspectral images, performed by replacing subsets of adjacent spectral bands by their means. An optimal partition of the hyperspectral bands is obtained by minimizing the Maximum of Mahalanobis Norms (MXMN) of the errors incurred by representing hyperspectral bands by constants. By minimizing the MXMN of the errors, one reduces the anomaly contribution to the errors, which allows retaining more anomaly-related information in the resulting channels.
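The channel-reduction search can be sketched by exhaustively scoring every partition of the bands into contiguous groups; for simplicity this toy version scores errors with a plain per-pixel ℓ2 norm rather than the Mahalanobis norm used in the thesis, and the tiny data matrix is an illustrative assumption:

```python
import numpy as np
from itertools import combinations

def channel_reduce(X, groups):
    """Replace each contiguous group of spectral bands (rows of X) by its mean."""
    return np.vstack([X[list(g)].mean(axis=0, keepdims=True).repeat(len(g), axis=0)
                      for g in groups])

def partitions(n_bands, n_channels):
    """All ways to split band indices 0..n_bands-1 into contiguous groups."""
    for cuts in combinations(range(1, n_bands), n_channels - 1):
        edges = (0, *cuts, n_bands)
        yield [range(edges[i], edges[i + 1]) for i in range(n_channels)]

def best_partition(X, n_channels):
    """Exhaustive search for the partition minimizing the worst per-pixel
    error norm of the piecewise-constant band approximation."""
    def cost(groups):
        E = X - channel_reduce(X, groups)       # error of the piecewise-constant fit
        return np.linalg.norm(E, axis=0).max()  # worst pixel's error norm
    return min(partitions(X.shape[0], n_channels), key=cost)

# toy example: bands 0-1 and bands 2-3 are duplicates, so grouping them is lossless
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 1.0, 2.0],
              [5.0, 4.0, 5.0],
              [5.0, 4.0, 5.0]])
print([list(g) for g in best_partition(X, 2)])   # → [[0, 1], [2, 3]]
```

Exhaustive search is only feasible for small band counts; with a hundred or more hyperspectral bands, the contiguous structure of the groups makes dynamic programming the natural way to scale the same minimax objective.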