This paper presents a study on estimating the emotions conveyed by clips of background music (BGM) for use in an automatic slideshow creation system. The system we aimed to develop automatically tags each piece of background music with the main emotion it conveys, in order to recommend the most suitable music clip to slideshow creators based on the dominant emotions of the embedded photos. As the first step of our research, we developed a machine learning model to estimate the emotions conveyed by a music clip and achieved 88% classification accuracy under cross-validation. The second part of our work involved developing a web application that uses the Microsoft Emotion API to detect the emotions in photos, so that the system can find the best candidate music for each photo in the slideshow. Sixteen users rated the recommended background music for a set of photos on a 5-point Likert scale, yielding average ratings of 4.1, 3.6, and 3.0 for photo sets 1, 2, and 3 of our evaluation task, respectively.