TY - JOUR
T1 - Least squares superposition codes with Bernoulli dictionary are still reliable at rates up to capacity
AU - Takeishi, Yoshinari
AU - Kawakita, Masanori
AU - Takeuchi, Jun'ichi
N1 - Copyright:
Copyright 2016 Elsevier B.V., All rights reserved.
PY - 2014/5
Y1 - 2014/5
AB - For the additive white Gaussian noise channel with average power constraint, sparse superposition codes with least squares decoding were proposed by Barron and Joseph in 2010. The codewords are designed using a dictionary whose entries are drawn from a Gaussian distribution, and the error probability is shown to be exponentially small for all rates up to the capacity. This paper proves that when each entry of the dictionary is instead drawn from a Bernoulli distribution, the error probability is also exponentially small for all rates up to the capacity. The proof relies on a central limit theorem-type inequality, which we establish for this analysis.
UR - http://www.scopus.com/inward/record.url?scp=84899655106&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84899655106&partnerID=8YFLogxK
U2 - 10.1109/TIT.2014.2312728
DO - 10.1109/TIT.2014.2312728
M3 - Article
AN - SCOPUS:84899655106
SN - 0018-9448
VL - 60
SP - 2737
EP - 2750
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 5
M1 - 6776455
ER -