In this era of the Internet of Things (IoT), the healthcare system is one of the fields that has received a lot of attention from researchers. Everyday objects such as mobile phones, watches, or shoes are coupled with sensors to build health systems for monitoring and managing people's health. Recently, some methods have focused on using food photography and associated image-processing techniques to assess food nutrients and control calorie intake. However, one of the critical issues in such image-based dietary assessment tools is the accurate and consistent estimation of the size and weight of the food portion in the image. In this paper, we propose a system that uses eating tools (cutlery) such as a spoon, fork, or chopsticks to measure the weight of the food in a picture, in order to estimate its calorie content for diet assessment and obesity prevention. Our system requires the user to take only a single top-view image with the cutlery in the frame. Using several image-processing techniques and the EXIF metadata of the image, the system automatically estimates the diameter and height of the food container and derives the food volume. Then, given the food type, the system combines the container diameter and height with the food type to estimate the weight of the food in the image. Our experiments show promising results: the system achieved an average relative error of 6.87% for weight estimation over the test food images.
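To make the diameter/height-to-weight step concrete, the sketch below shows one plausible way such a system could derive weight once the container dimensions and food type are known. The cylindrical-container model, the `fill_ratio` parameter, and the density values are illustrative assumptions, not the paper's actual method.

```python
import math

# Hypothetical per-food densities in g/cm^3 (illustrative values only).
FOOD_DENSITY = {"rice": 0.80, "soup": 1.00}

def estimate_weight(diameter_cm, height_cm, food_type, fill_ratio=1.0):
    """Approximate food weight from estimated container dimensions.

    Models the container as a cylinder, so the food volume is
    pi * (d/2)^2 * h, scaled by how full the container is; the
    weight then follows from a per-food density lookup.
    """
    volume_cm3 = math.pi * (diameter_cm / 2) ** 2 * height_cm * fill_ratio
    return volume_cm3 * FOOD_DENSITY[food_type]

# Example: a bowl estimated at 15 cm diameter and 5 cm height,
# about 80% full of rice.
weight_g = estimate_weight(15.0, 5.0, "rice", fill_ratio=0.8)
```

In a real pipeline, the diameter and height would come from the image-processing stage (using the cutlery as a known-size reference and the EXIF metadata for camera parameters), and the density table would be replaced by a per-food calibration learned from measured samples.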