M.Sc Thesis

M.Sc Student: Sde Chen Yael
Subject: Neural Network for Cloud Computed Tomography
Department: Department of Electrical and Computer Engineering
Supervisor: Prof. Yoav Schechner
Full thesis text: English version


Abstract

The atmosphere and clouds are vital to studies of climate, environmental science, and the modeling and forecasting of weather and pollution. Hence, the three-dimensional (3D) distribution of cloud microphysical parameters is of great importance. Recent works attempt 3D recovery by relying on multi-view images of clouds, where recovery is posed as a data-fitting problem using a forward model. The forward model is based on 3D radiative transfer (RT). In atmospheric and hydrologic remote sensing, multiple scattering is a major signal source. Multiple-scattering models lead to a nonlinear inverse problem, which is more complicated than common image-recovery problems, e.g., medical tomography, which rely on linear models.


Nevertheless, finding the scatterer density distribution can be stated as an optimization problem, solvable by gradient-based methods. Such techniques require computing the forward model and its gradient. The forward model can be computed using explicit numerical methods, such as the spherical harmonic discrete ordinate method (SHDOM), or alternatively by Monte Carlo RT.
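The gradient-based recovery loop can be sketched with a toy stand-in for the forward model. Everything below (the saturating scalar model, the step size, the domain size) is an illustrative assumption, not the thesis's actual RT solver; it only shows the structure of fitting a density vector to measurements by gradient descent on a data-fit cost.

```python
import numpy as np

# Toy stand-in for the RT forward model: nonlinear in the unknown
# density vector beta (the real forward model is 3D radiative transfer).
def forward(beta):
    # Saturating response, loosely mimicking signal saturation under
    # multiple scattering; purely illustrative.
    return 1.0 - np.exp(-beta)

def forward_grad(beta):
    # Analytic derivative (diagonal Jacobian) of the toy model above.
    return np.exp(-beta)

rng = np.random.default_rng(0)
beta_true = rng.uniform(0.5, 2.0, size=8)   # "true" scatterer density
y = forward(beta_true)                       # simulated measurements

# Gradient descent on the data-fit cost 0.5 * ||forward(beta) - y||^2
beta = np.full_like(beta_true, 1.0)          # initial guess
step = 0.5
for _ in range(2000):
    residual = forward(beta) - y
    beta -= step * forward_grad(beta) * residual
```

Each iteration requires one forward-model evaluation and one gradient evaluation; for a real RT solver, these are the dominant costs the thesis seeks to avoid at test time.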

However, deriving the gradient of the forward model poses a challenge, due to coupling between the field variables in RT. To overcome this, some prior work suggested an iterative alternating algorithm that uses a surrogate function. There, the coupling is avoided by fixing one field at each stage. This method allows a closed-form approximation of the gradient of the forward model and reduces the computational complexity. However, this approach has two main problems. First, the solution depends significantly on the initial guess, because the forward model is nonlinear in the unknowns. Second, the problem is very difficult to scale, and it typically runs on small domains.
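The alternating idea can be illustrated on a much simpler coupled problem. The rank-one factorization below is a hypothetical stand-in for the coupling between the density and the in-cloud radiation field, not the surrogate used in the cited work; it only shows how fixing one block of unknowns turns each stage into a closed-form least-squares update.

```python
import numpy as np

# A cost coupling two blocks of unknowns, ||Y - u v^T||^2, becomes a
# closed-form least-squares problem once one block is held fixed.
rng = np.random.default_rng(1)
u_true = rng.uniform(0.5, 1.5, size=6)
v_true = rng.uniform(0.5, 1.5, size=5)
Y = np.outer(u_true, v_true)             # coupled "measurements"

u = np.ones(6)                           # initial guess for block 1
v = np.ones(5)                           # initial guess for block 2
for _ in range(50):
    # Stage 1 -- fix v: the least-squares update of u is closed form.
    u = Y @ v / (v @ v)
    # Stage 2 -- fix u: the least-squares update of v is closed form.
    v = Y.T @ u / (u @ u)
```

The closed-form stages are cheap, but in the general nonlinear setting the same alternation inherits the drawback noted above: the stationary point reached depends on the initialization.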

To try and overcome these problems, we combine model-based signal processing with the power of deep neural networks.


The structure of our network is tailored to the characteristics of cloud fields. The advantage of this approach is that during scene analysis (at the test stage), the complexity is fixed, determined by the number of neural layers. Furthermore, the neural network learns the forward model and its gradient from training data. Hence, the computational effort is invested once, during training, and is avoided at the test stage. This approach enables efficient recovery of the 3D density of clouds.
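The learn-once, evaluate-cheaply idea can be sketched as follows. This is not the thesis's architecture: the one-hidden-layer network, the toy scalar forward model, and all training hyperparameters are assumptions for illustration. The point is that after training, evaluating the surrogate costs a fixed number of layer applications, independent of the physics solver's complexity.

```python
import numpy as np

rng = np.random.default_rng(2)

def forward_model(x):
    # Toy stand-in for the expensive RT forward model.
    return 1.0 - np.exp(-x)

# Training data: (density, measurement) pairs generated by the model.
X = rng.uniform(0.0, 3.0, size=(512, 1))
T = forward_model(X)

# One-hidden-layer network with tanh units, trained by gradient descent
# on the mean squared error (training cost is paid once, offline).
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # network prediction
    dP = 2.0 * (P - T) / len(X)       # gradient of the MSE w.r.t. P
    dH = dP @ W2.T * (1.0 - H**2)     # backprop through tanh
    W2 -= lr * H.T @ dP; b2 -= lr * dP.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)

# At the "test stage", the learned surrogate replaces the physics solver:
# a fixed number of matrix products, however costly forward_model is.
x_test = np.array([[0.7]])
pred = np.tanh(x_test @ W1 + b1) @ W2 + b2
```

Because the surrogate is differentiable, its gradient with respect to the input is also available in closed form via the chain rule, which is what makes it usable inside a gradient-based recovery loop.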


We analyze and compare the accuracy and runtime performance with state-of-the-art physics-based methods, using simulated data. In addition, we test the network on real images captured from an aircraft.