Matting and Compositing of Transparent and Refractive Objects

Sai-Kit Yeung

Chi-Keung Tang

Michael S. Brown

Sing Bing Kang

Abstract This paper introduces a new approach for matting and compositing transparent and refractive objects in photographs. The key to our work is an image-based matting model, termed the attenuation-refraction matte (ARM), that encodes plausible refractive properties of a transparent object along with its observed specularities and transmissive properties. We show that an object's ARM can be extracted directly from a photograph using simple user markup. Once extracted, the ARM is used to paste the object onto a new background with a variety of effects, including compound compositing, the Fresnel effect, scene depth, and even caustic shadows. User studies find our results preferable to those obtained with Photoshop, as well as perceptually valid in most cases. Our approach allows photo-editing of transparent and refractive objects in a manner that produces realistic effects previously possible only via 3D models or environment matting.

Keywords: Matting and Compositing, Transparent Objects



We thank anonymous reviewers for their constructive comments. Special thanks to Holly Rushmeier for pointing out [Jagnow et al. 2008; Jimenez et al. 2009] and helpful suggestions on the perception study and Thomas Chow for his technical advice on professional photography. The research was supported by the Hong Kong Research Grant Council under grant numbers 620207 and 619208.


This paper describes a new approach for matting transparent and refractive objects from a photograph and compositing the extracted object into a new scene. To accomplish this task, we have modified the opaque image matting and compositing equation to fuse refractive deformation, color attenuation, and foreground estimation. We term this extracted information the attenuation-refraction matte (ARM). In general, a single photograph is insufficient for extracting accurate refractive properties of a transparent object. Our approach instead recovers plausible light-transport properties of the matted object, exploiting our visual system's tolerance to inaccuracies in refractive phenomena, as previously demonstrated by work targeting image-based material editing [Khan et al. 2006].
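For context, the conventional opaque matting and compositing equation that such a model builds on is the standard alpha-compositing relation of Porter and Duff; the paper's ARM extends this formulation (the exact extended form is given later in the paper, so the sketch below shows only the classical baseline):

```latex
% Standard opaque matting/compositing equation (Porter-Duff):
% an observed pixel C is a convex combination of a foreground
% color F and a background color B, weighted by the alpha matte.
C = \alpha F + (1 - \alpha) B, \qquad \alpha \in [0, 1]
```

Matting inverts this relation to recover $\alpha$ and $F$ from $C$; compositing reapplies it against a new background $B'$. For transparent and refractive objects, a single per-pixel $\alpha$ cannot capture how the background is deformed and attenuated before reaching the camera, which motivates augmenting the model as described above.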