Updating Duplicate values in distributed multidatabase systems

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Redundant data stored for fast retrieval increase the cost of updates in distributed multidatabase systems. In this paper, update costs are reduced by introducing further redundancy. Duplicate values, that is, redundant data derived from relationships among original values, must be updated whenever the original values change in order to remain current. To update duplicate values, the relationships of the original data are retrieved to determine which duplicate values must be updated. If a file containing duplicate values satisfies the superkey condition introduced in this paper, the retrieval required for updates can be performed faster. Redundant data are added to a database site so that the condition is satisfied.
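The abstract's idea can be sketched concretely. In the hypothetical example below (not taken from the paper; all names, the data layout, and the schema are invented for illustration), order records duplicate an employee's department so local queries avoid a remote join; the duplicating file also carries the employee identifier, so the relationship needed to find stale duplicates is available locally, in the spirit of the paper's superkey condition:

```python
# Illustrative sketch only: propagating an update of an original value
# to its duplicate values. Names and data layout are hypothetical.

# Original file: employee -> department (the original values).
employees = {"alice": "sales", "bob": "engineering"}

# Duplicating file: each order record duplicates the department of the
# employee who handled it, so queries on orders need no remote join.
# The stored employee id plays the role of the extra redundant data
# that makes the file satisfy a superkey-like condition: it identifies
# exactly which records hold duplicates of a given original value.
orders = [
    {"order_id": 1, "employee": "alice", "department": "sales"},
    {"order_id": 2, "employee": "alice", "department": "sales"},
    {"order_id": 3, "employee": "bob", "department": "engineering"},
]

def update_department(employee, new_department):
    """Update the original value and refresh all of its duplicates."""
    employees[employee] = new_department
    # Because the duplicating file carries the employee id, the
    # relationship needed to locate stale duplicates can be read
    # locally; without it, a join against the original file at a
    # remote site would be required on every update.
    for record in orders:
        if record["employee"] == employee:
            record["department"] = new_department

update_department("alice", "marketing")
```

The trade-off the abstract describes is visible here: storing the extra attribute costs space, but it turns the retrieval step of each update into a local lookup.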

Original language: English
Title of host publication: Proceedings - 1st International Workshop on Interoperability in Multidatabase Systems, IMS 1991
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 243-246
Number of pages: 4
ISBN (Electronic): 0818622059, 9780818622052
DOIs
Publication status: Published - Jan 1 1991
Event: 1st International Workshop on Interoperability in Multidatabase Systems, IMS 1991 - Kyoto, Japan
Duration: Apr 7 1991 - Apr 9 1991

Publication series

Name: Proceedings - 1st International Workshop on Interoperability in Multidatabase Systems, IMS 1991

Conference

Conference: 1st International Workshop on Interoperability in Multidatabase Systems, IMS 1991
Country: Japan
City: Kyoto
Period: 4/7/91 - 4/9/91

All Science Journal Classification (ASJC) codes

  • Computer Science(all)

