In this paper, we describe a new one-shot scanning technique using a camera and a projector. Generally, a 3D measurement system based on a camera and a projector requires pre-calibration, such as measuring the relative position of the two devices. Eliminating this calibration process would greatly improve the convenience of the system: for example, a single capture by a handheld camera of an object illuminated by a handheld projector would then suffice to reconstruct the object's shape. To achieve this, we propose a self-calibration technique that computes the relative pose of the projector and camera from a projected grid pattern. This is similar to the relative pose or motion problem for two cameras, but in our case correspondences are not explicitly given. The actual algorithm is a simple exhaustive search over a finite set of hypotheses, using a cost function based on the epipolar constraint. In the experiments, we present successful reconstructions with the proposed method on both synthetic and real data.
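To make the search strategy concrete, the following is a minimal sketch of an exhaustive hypothesis search scored by the epipolar constraint. It is an illustration only, not the paper's actual method: it assumes point correspondences are already available (in the paper they are not explicitly given), fixes the baseline, and parameterizes the rotation hypothesis by a single angle about the y-axis; all names and the synthetic setup are assumptions.

```python
import numpy as np

def skew(t):
    # Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def rot_y(a):
    # Rotation by angle a about the y-axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def epipolar_cost(R, t, x1, x2):
    # Sum of algebraic epipolar residuals |x2^T E x1|, with E = [t]_x R
    E = skew(t) @ R
    return np.sum(np.abs(np.einsum('ni,ij,nj->n', x2, E, x1)))

# Synthetic ground truth (hypothetical): points in front of view 1,
# rotation of 0.3 rad about y, unit baseline along x
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (50, 3)) + np.array([0.0, 0.0, 5.0])
R_true, t_true = rot_y(0.3), np.array([1.0, 0.0, 0.0])

x1 = X / X[:, 2:]                        # normalized image coords, view 1
X2 = (R_true @ X.T).T + t_true           # same points in view-2 frame
x2 = X2 / X2[:, 2:]                      # normalized image coords, view 2

# Exhaustive search over a finite set of rotation hypotheses
angles = np.linspace(-0.5, 0.5, 101)
costs = [epipolar_cost(rot_y(a), t_true, x1, x2) for a in angles]
best = angles[int(np.argmin(costs))]
print(f"recovered angle: {best:.3f} rad (true: 0.300)")
```

The cost of the true hypothesis is (near) zero because the epipolar constraint x2^T E x1 = 0 holds exactly for noise-free correspondences; the grid hypothesis closest to the true pose therefore wins the search.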