This paper introduces a near-future perception system called Previewed Reality. In an environment where a human and a robot coexist, unexpected collisions between them must be avoided as far as possible. In many cases, the robot is controlled carefully so as not to collide with the human; however, it is almost impossible to predict human behavior perfectly in advance. Conversely, if the user can see the motion a robot will perform in advance, he or she can avoid hazardous situations and coexist safely with the robot. To let the user perceive such near-future events naturally, we developed Previewed Reality, which consists of an informationally structured environment, a VR or AR display, and a dynamics simulator. Numerous sensors embedded in the informationally structured environment sense information such as the positions of furniture, objects, humans, and robots, and this information is stored structurally in a database. Using a robot motion planner and the dynamics simulator, the system can therefore forecast events that will actually occur in the near future and synthesize virtual images of them from the user's viewpoint. The user's viewpoint, that is, the position and orientation of the VR or AR display, is tracked by an optical tracking system in the informationally structured environment or by SLAM on the AR display. The synthesized images are presented to the user by overlaying them on the real scene on the VR or AR display. The system thus provides human-friendly communication between a human and a robotic system: by intuitively showing the user possible hazardous situations in advance, it enables the human and the robot to coexist safely.
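The pipeline described above, sensing entity poses into a database, forecasting the robot's near-future motion, and flagging hazards for the preview display, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class and function names (`EnvironmentDB`, `preview_robot_motion`), the 2D poses, and the distance-based hazard test are all simplifying assumptions standing in for the real sensor database, motion planner, and dynamics simulator.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Simplified 2D pose; the real system tracks full 6-DoF poses."""
    x: float
    y: float

class EnvironmentDB:
    """Stand-in for the database of the informationally structured
    environment: stores the latest sensed pose of each entity."""
    def __init__(self):
        self._poses = {}

    def update(self, name, pose):
        self._poses[name] = pose

    def get(self, name):
        return self._poses[name]

def preview_robot_motion(db, waypoints, steps_ahead, safety_radius):
    """Forecast the robot's near-future poses along its planned waypoints
    and flag any pose that comes within safety_radius of the sensed human.
    Returns (future_poses, hazard_flags) for the VR/AR display to overlay."""
    human = db.get("human")
    future = waypoints[:steps_ahead]
    hazards = [
        ((p.x - human.x) ** 2 + (p.y - human.y) ** 2) ** 0.5 < safety_radius
        for p in future
    ]
    return future, hazards

# Sensed state: a human standing 1 m in front of the robot.
db = EnvironmentDB()
db.update("human", Pose(1.0, 0.0))
db.update("robot", Pose(0.0, 0.0))

# Planned robot trajectory heading toward the human's position.
plan = [Pose(0.2 * i, 0.0) for i in range(1, 8)]
future, hazards = preview_robot_motion(db, plan, steps_ahead=5, safety_radius=0.5)
print(any(hazards))  # True: the preview reveals a close approach in advance
```

In the actual system this forecast would be produced by the dynamics simulator rather than a geometric test, and the flagged poses would be rendered from the tracked viewpoint of the VR or AR display instead of printed.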
All Science Journal Classification (ASJC) codes
- Control and Systems Engineering
- Human-Computer Interaction
- Hardware and Architecture
- Computer Science Applications