Assisting aging or disabled people is a vital problem in today's world. For this purpose, wearable power-assist robots have been proposed to support their activities of daily living. However, in some cases not only the motor ability but also the sensory ability of the user is deteriorated. In such cases, the user might not be able to perceive the environment properly and, as a result, might not be able to interact with other people correctly and in a timely manner. This paper presents a method to estimate interacting motion intention for perception-assist with an upper-limb wearable power-assist robot. In this method, interacting motion intention is identified using visual information taken from a wearable camera. A motion classifier is trained to identify a similar motion at an early stage so that the perception of the user can be assisted. A framework is proposed to identify the motion intention in real time by combining existing descriptors, considering both computational time and accuracy. An experiment is carried out to evaluate the effectiveness of the proposed method. Based on the results obtained from the visual information, the user's motion is modified by the wearable power-assist robot to assist the perception of the user.