In this paper, we propose a physics-based method to synthesize facial images in visible wavelengths from multi-band near-infrared (NIR) images. Studies of the photometric properties of human skin show that the melanin and hemoglobin components are the dominant factors affecting skin appearance under different illumination spectra. Specifically, the set of intensities observed at a given surface point across varying wavelengths is represented by a linear combination of the two pigment components. Our proposed method learns the spectral basis vectors, which describe the absorbance due to the two pigments, from a multispectral image dataset using Independent Component Analysis (ICA). It then estimates the coefficients, i.e., the pixel-wise densities of the two pigments, from a multi-band NIR image, and finally converts it to a visible-light (VIS) image. We demonstrate that the proposed method works well for real facial images even when only a small dataset is available for learning the basis vectors.
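The pipeline outlined above, learning two pigment spectral bases with ICA and then estimating pixel-wise densities from a subset of bands, can be illustrated on synthetic data. The following is a minimal sketch, not the paper's implementation: the pigment spectra, band count, and choice of NIR/VIS band indices are all hypothetical stand-ins, and `FastICA` from scikit-learn is used as a generic ICA solver.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_bands, n_pixels = 8, 500  # hypothetical: 8 spectral bands, 500 skin pixels

# Hypothetical pigment absorbance spectra (stand-ins for melanin / hemoglobin)
melanin = np.exp(-np.linspace(0.0, 2.0, n_bands))              # monotone decay
hemoglobin = np.exp(-((np.arange(n_bands) - 3.0) ** 2) / 4.0)  # band-like peak
S_true = np.stack([melanin, hemoglobin])                       # (2, n_bands)

# Linear mixing model: per-pixel absorbance = pigment densities @ pigment spectra
C_true = rng.uniform(0.1, 1.0, size=(n_pixels, 2))             # pixel-wise densities
X = C_true @ S_true                                            # (n_pixels, n_bands)

# Learn two spectral basis vectors (the columns of mixing_) with ICA
ica = FastICA(n_components=2, random_state=0)
ica.fit(X)
B = ica.mixing_                                                # (n_bands, 2)

# Treat the last 3 bands as the "NIR" observation and the rest as "VIS" targets
nir, vis = np.arange(5, 8), np.arange(0, 5)
x = X[0]  # one test pixel

# Estimate the pigment coefficients from the NIR bands by least squares,
# then project them back through the VIS rows of the learned basis
coef, *_ = np.linalg.lstsq(B[nir], x[nir] - ica.mean_[nir], rcond=None)
x_vis_pred = B[vis] @ coef + ica.mean_[vis]

print(np.abs(x_vis_pred - x[vis]).max())  # small: VIS bands recovered from NIR
```

Because the synthetic data follow the rank-2 linear model exactly, the least-squares coefficients reproduce the VIS bands almost perfectly; on real skin images, noise and model mismatch would make this an approximation.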