Approximate Nearest Neighbor Fields in Video


Nir Ben-Zrihem and Lihi Zelnik-Manor
Technion, Haifa, Israel


Abstract


We introduce RIANN (Ring Intersection Approximate Nearest Neighbor search), an algorithm for matching patches of a video to a set of reference patches in real-time. For each query, RIANN finds potential matches by intersecting rings around key points in appearance space. Its search complexity is inversely correlated with the amount of temporal change, making it a good fit for videos, where most patches typically change slowly over time. Experiments show that RIANN is up to two orders of magnitude faster than previous ANN methods and is the only solution that operates in real-time. We further demonstrate how RIANN can be used for real-time video processing and provide examples for a range of real-time video applications, including colorization, denoising, and several artistic effects.
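
For intuition, below is a minimal sketch (in Python/NumPy, not the released Matlab code) of the ring-intersection idea: distances from the reference patches to a few anchor points in appearance space are precomputed, and a query's candidate set is the intersection, over anchors, of the reference patches whose anchor distances fall within a ring around the query's own anchor distances (justified by the triangle inequality). The function names and the fixed ring width are illustrative assumptions; in RIANN the search effort adapts per patch to the amount of temporal change.

import numpy as np

# Illustrative sketch of ring-intersection ANN search (not the released code).
# reference : (N, d) array of reference patch descriptors
# anchors   : (K, d) array of key points in appearance space

def build_index(reference, anchors):
    # Precompute the distance of every reference patch to every anchor.
    dists = np.linalg.norm(reference[:, None, :] - anchors[None, :, :], axis=2)  # (N, K)
    order = np.argsort(dists, axis=0)                     # per-anchor ordering of patches
    sorted_d = np.take_along_axis(dists, order, axis=0)   # sorted distances per anchor
    return order, sorted_d

def query(q, reference, anchors, order, sorted_d, width):
    # Triangle inequality: any reference patch within `width` of q must lie in the
    # ring [||q - a|| - width, ||q - a|| + width] around every anchor a.
    candidates = None
    for k in range(anchors.shape[0]):
        r = np.linalg.norm(q - anchors[k])
        lo = np.searchsorted(sorted_d[:, k], r - width, side="left")
        hi = np.searchsorted(sorted_d[:, k], r + width, side="right")
        ring = set(order[lo:hi, k].tolist())
        candidates = ring if candidates is None else candidates & ring
        if not candidates:                                # no patch can be within `width`
            return None
    idx = np.array(sorted(candidates))
    # Exact check only on the surviving candidates.
    return idx[np.argmin(np.linalg.norm(reference[idx] - q, axis=1))]
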


Paper


Nir Ben-Zrihem and Lihi Zelnik-Manor, "Approximate Nearest Neighbor Fields in Video", CVPR 2015. [pdf]

 

Results

 


Code


RIANN with local dictionary (trained on the first frame of the video) [download]
RIANN with global dictionary (trained on multiple frames from multiple videos) [download]
The code was tested on 64-bit Windows with Matlab 2012b, 2013a, and 2015a.



The code is for academic purposes only. Please cite this paper if you make use of it:

@conference{RIANN,
  title     = {RIANN: Approximate Nearest Neighbor Fields in Video},
  author    = {Ben-Zrihem, N. and Zelnik-Manor, L.},
  booktitle = {CVPR},
  year      = {2015}
}




In case of any problems, please contact us at bentzinir (at) gmail.com or hovav (at) ee.technion.ac.il.