3D terrain generation and texture manipulation by voice input

Umair Azfar Khan, Yoshihiro Okada

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

1 Citation (Scopus)

Abstract

Computer graphics have been used by artists and programmers to create real and imaginative worlds in media such as games and commercial films. Although the impact of 3D graphics has been immense, the techniques for generating virtual content have never become easy. As a result, the common user has remained detached from this area, and people have been unable to create animated stories of their own. This paper introduces a novel way of storytelling that lets users generate their own 3D content through voice input. It covers terrain generation and texture manipulation through voice input, which helps users define the basic scene layout of their stories. The content that can be generated is currently limited; however, this will expand over time and give the user more control over manipulating the generated content.
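The pipeline the abstract describes, where a recognized voice command drives a terrain operation, might be sketched roughly as follows. The command names, the list-of-lists heightmap representation, and the specific operations are illustrative assumptions, not the paper's actual grammar or implementation:

```python
import random

def generate_terrain(size, roughness=1.0, seed=0):
    """Generate a square heightmap (list of lists) with random heights."""
    rng = random.Random(seed)
    return [[rng.uniform(0.0, roughness) for _ in range(size)]
            for _ in range(size)]

def apply_voice_command(heightmap, command):
    """Dispatch a recognized voice command to a heightmap operation.

    The command vocabulary here is hypothetical; a real system would
    map the speech recognizer's output onto operations like these.
    """
    if command == "raise terrain":
        # Lift the whole terrain by one unit.
        return [[h + 1.0 for h in row] for row in heightmap]
    if command == "flatten terrain":
        # Replace every height with the terrain's average height.
        avg = sum(sum(row) for row in heightmap) / (len(heightmap) ** 2)
        return [[avg for _ in row] for row in heightmap]
    if command == "invert terrain":
        # Turn peaks into valleys relative to the highest point.
        peak = max(max(row) for row in heightmap)
        return [[peak - h for h in row] for row in heightmap]
    raise ValueError(f"unrecognized command: {command}")

terrain = generate_terrain(4, roughness=2.0, seed=42)
raised = apply_voice_command(terrain, "raise terrain")
```

Keeping the speech front end separate from the terrain operations, as in this sketch, lets the command vocabulary grow without touching the generation code, which matches the paper's stated plan to expand the generated content over time.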

Original language: English
Title of host publication: 4th Asian Conference on Intelligent Games and Simulation, GAME-ON ASIA 2012 - 4th Asian Simulation Technology Conference, ASTEC 2012
Publisher: EUROSIS
Pages: 56-60
Number of pages: 5
ISBN (Electronic): 9789077381687
Publication status: Published - Jan 1 2012
Event: 4th Asian Simulation and AI in Games Conference, GAME-ON ASIA 2012 and the 4th Asian Simulation Technology Conference, ASTEC 2012 - Kyoto, Japan
Duration: Feb 24 2012 - Feb 26 2012


All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Human-Computer Interaction
  • Modelling and Simulation

