M.Sc Student: Nir Baram
Subject: Approximate Nearest Neighbor Search for Video
Department: Department of Electrical Engineering
Supervisor: Professor Lihi Zelnik-Manor
We introduce RIANN (Ring Intersection Approximate Nearest Neighbor), a patch-matching algorithm for finding ANNs in video. Designed to leverage the statistical properties of video, its search complexity is correlated with the amount of temporal difference between frames. RIANN exploits temporal continuity by using the matches produced for frame t−1 as pivots for finding the matches for frame t; information is thus propagated through time, as opposed to previous methods that propagate information through space. Potential matches are extracted by intersecting high-dimensional rings (i.e., spherical shells) around key points in appearance space. RIANN requires a relatively long pre-processing stage; however, we show that this stage is global and can be performed once for multiple videos. The pre-processing stage is dedicated to creating a search model from a selected set of representative patches. Reference patches can be extracted from a single image or a video frame; unlike in previous methods, however, they can also be an unordered set of patches (e.g., a dictionary). Experiments show that RIANN is up to two orders of magnitude faster than previous methods and is the only solution that operates in real time. In addition, we introduce a generic framework for real-time video processing based on RIANN. We show empirically that this framework can be used for a variety of applications, including colorization, denoising, and several artistic effects, all in real time.
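To make the ring-intersection idea concrete, the following is a minimal NumPy sketch, not the thesis implementation: in a pre-processing step we record the distance of every reference patch to a set of pivot points; at query time, candidates are the patches whose recorded distances lie within a thin spherical shell (a "ring") around the query's own distance to each pivot, and the candidate set is the intersection of those shells. By the triangle inequality, any patch within `width` of the query necessarily falls inside every ring, so the true near neighbor survives the intersection. All names (`build_index`, `ring_intersection_query`, `ann`) and the single uniform ring width are illustrative assumptions.

```python
import numpy as np

def build_index(refs, pivots):
    """Pre-processing (assumed sketch): distance from every reference
    patch to every pivot. refs: (n, d), pivots: (k, d) -> (n, k)."""
    return np.linalg.norm(refs[:, None, :] - pivots[None, :, :], axis=2)

def ring_intersection_query(query, pivots, ref_dists, width):
    """Candidates lie in the intersection of spherical shells of
    half-width `width` centered on the query's distance to each pivot."""
    qd = np.linalg.norm(pivots - query, axis=1)           # query-to-pivot distances, (k,)
    in_ring = np.abs(ref_dists - qd[None, :]) <= width    # (n, k) shell membership
    return np.flatnonzero(in_ring.all(axis=1))            # intersect over all pivots

def ann(query, refs, pivots, ref_dists, width):
    """Return the index of the best candidate, or None if the
    intersection is empty (width too small for this query)."""
    cand = ring_intersection_query(query, pivots, ref_dists, width)
    if cand.size == 0:
        return None
    d = np.linalg.norm(refs[cand] - query, axis=1)        # exact check on the few survivors
    return int(cand[np.argmin(d)])
```

In RIANN the pivots for frame t would come from the matches of frame t−1, so a nearly static frame yields tight rings and tiny candidate sets; this sketch uses fixed random pivots purely for illustration.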