Title: Image and Video Decolorization by Fusion
Authors: ANCUTI, Codruta
Issue Date: 2010
Publisher: Springer Verlag
Source: Proceedings of the Asian Conference on Computer Vision (ACCV 2010)
Series/Report: Lecture Notes in Computer Science
Series/Report no.: 6492
Abstract: In this paper we present a novel decolorization strategy based on image-fusion principles. We show that, by defining proper inputs and weight maps, our fusion-based strategy can yield accurate decolorized images in which the discriminability and appearance of the original color images are well preserved. Aside from the independent R, G, B channels, we employ an additional input channel that conserves color contrast, based on the Helmholtz-Kohlrausch effect. We use three different weight maps to control saliency, exposure and saturation. To prevent potential artifacts that could be introduced by applying the weight maps in a per-pixel fashion, our algorithm is designed as a multi-scale approach. The potential of the new operator has been tested on a large dataset of both natural and synthetic images. We demonstrate the effectiveness of our technique through an extensive evaluation against state-of-the-art grayscale methods, and show its ability to decolorize videos in a consistent manner.
Notes: Reprint address: Ancuti, CO (reprint author), Hasselt Univ tUL IBBT, Expertise Ctr Digital Media, Wetenschapspk 2, B-3590 Diepenbeek, Belgium. Addresses: 1. Hasselt Univ tUL IBBT, Expertise Ctr Digital Media, B-3590 Diepenbeek, Belgium
Keywords: Color; Computer Science, Theory & Methods
Document URI: http://hdl.handle.net/1942/11618
ISBN: 978-3-642-19314-9
ISI #: 000296690900007
Category: C1
Type: Proceedings Paper
Validations: ecoom 2012
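The abstract describes a weighted-fusion scheme: several input channels are blended per pixel using normalized weight maps. The following is a minimal toy sketch of that idea in Python/NumPy, not the authors' actual method: it fuses only the R, G, B channels (omitting the Helmholtz-Kohlrausch channel and the multi-scale refinement), and the `well_exposedness` weight is a hypothetical stand-in for the paper's three weight maps.

```python
import numpy as np

def fusion_decolorize(rgb):
    """Toy per-pixel fusion decolorization sketch.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    Returns a grayscale array of shape (H, W).
    """
    # Inputs: the independent R, G, B channels (the paper adds a fourth
    # channel based on the Helmholtz-Kohlrausch effect, omitted here).
    inputs = [rgb[..., c] for c in range(3)]

    # Hypothetical "well-exposedness" weight: favors mid-range intensities
    # via a Gaussian centered at 0.5 (stand-in for the paper's saliency,
    # exposure and saturation maps).
    def well_exposedness(ch, sigma=0.2):
        return np.exp(-((ch - 0.5) ** 2) / (2.0 * sigma ** 2))

    weights = np.stack([well_exposedness(ch) for ch in inputs])

    # Normalize so the weights sum to 1 at every pixel.
    weights = weights / (weights.sum(axis=0) + 1e-12)

    # Per-pixel weighted blend of the inputs. Applied naively like this,
    # weight maps can introduce artifacts; the paper avoids them with a
    # multi-scale (pyramid-based) fusion instead of this single-scale sum.
    return sum(w * ch for w, ch in zip(weights, inputs))
```

Because the normalized weights form a convex combination at each pixel, the output always stays within the range of the input channels; a uniform gray image maps to itself.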
Appears in Collections: Research publications
Files in This Item: FINAL_ACCV_Fusion.pdf (Preprint, 5.06 MB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.