Person recognition from gait images is generally not robust to changes in appearance, such as variations in walking direction. Conventional methods typically train a model to transform gait features or gait images into those at a different viewpoint, but performance degrades when the model has not been trained on the subject's viewpoint. In this paper we propose a novel gait recognition approach that differs substantially from existing approaches: the subject's sequential 3D models and motion are reconstructed directly from captured images, and images from arbitrary viewpoints are synthesized from the reconstructed 3D models, making recognition robust to changes in walking direction. Moreover, we propose a gait feature, named Frame Difference Frieze Pattern (FDFP), which is robust to high-frequency noise. The effectiveness of the proposed method is demonstrated through experiments on a database of 41 subjects.
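The abstract does not define the FDFP construction in detail; as an illustrative sketch only, one plausible reading is a frieze pattern computed from absolute frame differences of a silhouette sequence rather than from the silhouettes themselves. The projection axis and normalization below are assumptions, not the authors' exact formulation.

```python
import numpy as np

def fdfp(silhouettes):
    """Sketch of a Frame Difference Frieze Pattern (FDFP).

    silhouettes: (T, H, W) array of binary gait silhouettes.
    Returns an (H, T-1) pattern: each column is the row-wise
    projection of the absolute difference between consecutive
    frames. The projection axis is an assumption; the abstract
    does not specify it.
    """
    sil = np.asarray(silhouettes, dtype=float)
    diffs = np.abs(np.diff(sil, axis=0))   # (T-1, H, W) frame differences
    return diffs.sum(axis=2).T             # project along width -> (H, T-1)

# Tiny synthetic example: a one-pixel blob moving right over 3 frames.
frames = np.zeros((3, 4, 4))
frames[0, 1, 0] = frames[1, 1, 1] = frames[2, 1, 2] = 1
pattern = fdfp(frames)
print(pattern.shape)  # (4, 2)
```

Differencing consecutive frames suppresses the static silhouette interior and emphasizes motion boundaries, which is one way such a feature could gain robustness to high-frequency noise.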