Sequential data assimilation: Information fusion of a numerical simulation and large scale observation data

Kazuyuki Nakamura, Tomoyuki Higuchi, Naoki Hirose

    Research output: Contribution to journal › Article › peer-review

    24 Citations (Scopus)

    Abstract

    Data assimilation is a method of combining an imperfect simulation model with incomplete observation data. Sequential data assimilation is a form of data assimilation in which the simulation variables are corrected at every observation time step. The ensemble Kalman filter was developed for sequential data assimilation and is frequently used in geophysics. The particle filter, developed and used in statistics, is similar in that it is also an ensemble-based method, but it has different properties. In this paper, these two ensemble-based filters are compared and characterized through a matrix representation. An application of sequential data assimilation to a tsunami simulation model is also shown through a numerical experiment, in which the particle filter is employed. An erroneous bottom topography is corrected in the numerical experiment, which demonstrates that the particle filter is a useful tool as a sequential data assimilation method.
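    The sequential correction cycle described in the abstract (propagate the ensemble with the imperfect model, then correct it against each new observation) can be sketched as a bootstrap particle filter. This is a minimal illustration only, not the authors' implementation; the toy scalar state, the Gaussian likelihood, and all function names here are assumptions for the sketch.

    ```python
    import numpy as np

    def particle_filter_step(particles, observation, simulate, likelihood, rng):
        """One assimilation cycle of a bootstrap particle filter:
        propagate each particle with the simulation model, weight each
        by the observation likelihood, then resample."""
        # Prediction: advance every ensemble member with the (imperfect) model.
        predicted = np.array([simulate(p) for p in particles])
        # Filtering: weight particles by how well they explain the observation.
        weights = np.array([likelihood(observation, p) for p in predicted])
        weights /= weights.sum()
        # Resampling: duplicate high-weight particles, discard low-weight ones.
        idx = rng.choice(len(predicted), size=len(predicted), p=weights)
        return predicted[idx]

    # Toy example: estimate a constant scalar state observed with Gaussian noise.
    rng = np.random.default_rng(0)
    true_state = 1.0
    particles = rng.normal(0.0, 2.0, size=1000)       # initial ensemble
    simulate = lambda x: x + rng.normal(0.0, 0.1)     # model with process noise
    likelihood = lambda y, x: np.exp(-0.5 * (y - x) ** 2 / 0.25)

    for _ in range(20):
        y = true_state + rng.normal(0.0, 0.5)         # synthetic observation
        particles = particle_filter_step(particles, y, simulate, likelihood, rng)

    print(particles.mean())  # ensemble mean should approach the true state
    ```

    Unlike the ensemble Kalman filter, which applies a linear Gaussian update to the ensemble, the resampling step above makes no Gaussian assumption, which is the property difference the paper examines.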

    Original language: English
    Pages (from-to): 608-626
    Number of pages: 19
    Journal: Journal of Universal Computer Science
    Volume: 12
    Issue number: 6
    Publication status: Published - Aug 11 2006

    All Science Journal Classification (ASJC) codes

    • Theoretical Computer Science
    • Computer Science (all)

