Estimation of the vocal tract spectrum from articulatory movements using phoneme-dependent neural networks

Kohei Wakamiya, Tokihiko Kaburagi, Takuya Tsuji, Jiji Kim

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper presents a method for estimating the vocal tract spectrum from articulatory movements. The method is based on the interpolation of spectra obtained from phoneme-dependent neural networks. Given the phonemic context and the articulation timing corresponding to each phoneme, the proposed method first transforms articulator positions into phoneme-dependent spectra. The vocal tract spectrum is then estimated by interpolating the transformed spectra, with weights based on the distances between the input articulator position and the positions associated with the preceding and succeeding phonemes. A training procedure for the neural networks that takes the spectral interpolation into account is also presented. Pairs of articulatory and acoustic data, collected by recording articulator positions and speech simultaneously, were used as training and test data. Finally, we show an estimation result obtained with the proposed method.
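The following is a minimal sketch of the distance-weighted spectral interpolation described in the abstract. It assumes each phoneme-dependent network is available as a callable and that a representative articulator position exists per phoneme; the specific network architecture, distance measure, and weighting scheme used in the paper are not given here, so these details are illustrative assumptions only.

```python
import numpy as np


def interpolate_spectrum(x, prev_phoneme, next_phoneme, nets, targets):
    """Sketch: estimate a vocal tract spectrum for articulator position x.

    nets    : dict mapping phoneme label -> callable returning a spectrum
              (stand-in for a phoneme-dependent neural network).
    targets : dict mapping phoneme label -> representative articulator
              position for that phoneme (assumed here for the distance term).
    """
    # Phoneme-dependent spectra produced from the same articulator input.
    spec_prev = nets[prev_phoneme](x)
    spec_next = nets[next_phoneme](x)

    # Distances between the input articulator position and the positions
    # associated with the preceding and succeeding phonemes.
    d_prev = np.linalg.norm(x - targets[prev_phoneme])
    d_next = np.linalg.norm(x - targets[next_phoneme])

    # Inverse-distance weights: the closer phoneme contributes more.
    denom = d_prev + d_next + 1e-12
    w_prev = d_next / denom
    w_next = d_prev / denom

    return w_prev * spec_prev + w_next * spec_next
```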

Original language: English
Pages: 517-520
Number of pages: 4
Publication status: Published - Jan 1 2004
Event: 8th International Conference on Spoken Language Processing, ICSLP 2004 - Jeju, Jeju Island, Korea, Republic of
Duration: Oct 4 2004 - Oct 8 2004

Other

Other: 8th International Conference on Spoken Language Processing, ICSLP 2004
Country: Korea, Republic of
City: Jeju, Jeju Island
Period: 10/4/04 - 10/8/04

All Science Journal Classification (ASJC) codes

  • Language and Linguistics
  • Linguistics and Language

