Please use this identifier to cite or link to this item:
http://hdl.handle.net/1942/8458
Title: Light mixture estimation for spatially varying white balance
Authors: Hsu, E.; Mertens, Tom; Paris, S.; Avidan, S.; Durand, F.
Issue Date: 2008
Publisher: Association for Computing Machinery
Source: ACM Transactions on Graphics, 27(3), Article 70
Abstract: White balance is a crucial step in the photographic pipeline. It ensures the proper rendition of images by eliminating color casts due to differing illuminants. Digital cameras and editing programs provide white balance tools that assume a single type of light per image, such as daylight. However, many photos are taken under mixed lighting. We propose a white balance technique for scenes with two light types that are specified by the user. This covers many typical situations involving indoor/outdoor or flash/ambient light mixtures. Since we work from a single image, the problem is highly under-constrained. Our method recovers a set of dominant material colors, which allows us to estimate the local intensity mixture of the two light types. Using this mixture, we can neutralize the light colors and render visually pleasing images. Our method can also be used to achieve post-exposure relighting effects.
Notes: MIT, Computer Science & Artificial Intelligence Laboratory, Cambridge, MA 02139, USA; Hasselt University, tUL, IBBT, Expertise Centre for Digital Media, Diepenbeek, Belgium; Adobe Systems Inc., San Jose, CA 95110, USA.
Keywords: image processing; computational photography; white balance; color constancy
Document URI: http://hdl.handle.net/1942/8458
Link to publication/dataset: http://doi.acm.org/10.1145/1360612.1360669
ISSN: 0730-0301
e-ISSN: 1557-7368
ISI #: 000258262000059
Category: A1
Type: Journal Contribution
Validations: ecoom 2009
Appears in Collections: Research publications
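
The abstract describes the final correction step only at a high level: once the per-pixel mixture of the two light types is known, the light colors can be neutralized. The sketch below illustrates that neutralization under simple assumptions; it is not the paper's method. The function name, the convex-blend illuminant model, and the division-based (von Kries-style) correction are illustrative choices, and the mixture map `alpha` is assumed to be already estimated, which is the paper's actual contribution.

```python
import numpy as np

def mixed_light_white_balance(image, alpha, light1, light2):
    """Neutralize a two-illuminant mixture given a per-pixel mixture map.

    image:  (H, W, 3) linear RGB image.
    alpha:  (H, W) per-pixel weight of light1; (1 - alpha) goes to light2.
    light1, light2: RGB colors of the two user-specified illuminants,
        e.g. flash and ambient. All names here are hypothetical.
    """
    light1 = np.asarray(light1, dtype=np.float64)
    light2 = np.asarray(light2, dtype=np.float64)
    # Per-pixel effective illuminant: a convex blend of the two light colors
    # (an assumed model of how the two lights combine at each pixel).
    illum = alpha[..., None] * light1 + (1.0 - alpha[..., None]) * light2
    # Divide out the illuminant channel-wise, guarding against zeros.
    return image / np.maximum(illum, 1e-8)

# Example: a 2x2 image lit by a bluish flash and an orange ambient light.
img = np.ones((2, 2, 3))
alpha = np.array([[1.0, 0.5], [0.5, 0.0]])
out = mixed_light_white_balance(img, alpha, [0.9, 1.0, 1.1], [1.3, 1.0, 0.6])
```

Pixels with alpha = 1 are corrected for light1 alone, alpha = 0 for light2 alone, and intermediate values for the blend, which is what makes the white balance spatially varying.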