In this paper we propose four models of unmanned blimps: Robots, Pets, Agents, and Puppets, classified along two axes: whether the blimp is autonomous, and whether it is shown to people. Robots and Pets are autonomous, while Agents and Puppets are not; Robots and Agents are shown to people, while Pets and Puppets are not. Based on these models, we approach interactive blimps as puppets that visualize performances from people to people through real-time light effects and motions. We implemented prototype applications in which people perform by controlling the blimp's light effects and flight motions with voice input via mobile phones and with a physical controller. We observed these prototypes in a laboratory experiment and at demo exhibitions, and we discuss our models based on the spectators' experience.