Additive white Gaussian noise
===Achievability===
In this section, we show achievability of the upper bound on the rate from the last section.

A codebook, known to both encoder and decoder, is generated by selecting codewords of length ''n'', i.i.d. Gaussian with variance <math>P-\varepsilon</math> and mean zero. For large ''n'', the empirical variance of the codebook will be very close to the variance of its distribution, so the power constraint is violated only with small probability. Received messages are decoded to the unique message in the codebook that is jointly typical with the received vector. If there is no such message, or if the power constraint is violated, a decoding error is declared.

Let <math>X^n(i)</math> denote the codeword for message <math>i</math>, while <math>Y^n</math> is, as before, the received vector. Define the following three events:

# Event <math>U</math>: the power of the transmitted codeword exceeds <math>P</math>.
# Event <math>V</math>: the transmitted and received codewords are not jointly typical.
# Event <math>E_j</math>, for <math>j \neq i</math>: <math>(X^n(j), Y^n)</math> is in <math>A_\varepsilon^{(n)}</math>, the [[typical set]]; that is, an incorrect codeword is jointly typical with the received vector.

An error therefore occurs if <math>U</math>, <math>V</math> or any of the <math>E_j</math> occur. By the law of large numbers, <math>P(U)</math> goes to zero as ''n'' approaches infinity, and by the joint [[Asymptotic Equipartition Property]] the same applies to <math>P(V)</math>. Therefore, for sufficiently large <math>n</math>, <math>P(U)</math> and <math>P(V)</math> are each less than <math>\varepsilon</math>. Since <math>X^n(i)</math> and <math>X^n(j)</math> are independent for <math>i \neq j</math>, <math>X^n(j)</math> and <math>Y^n</math> are also independent. Therefore, by the joint AEP, <math>P(E_j) \leq 2^{-n(I(X;Y)-3\varepsilon)}</math>.
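The concentration argument behind event <math>U</math> can be illustrated numerically. The sketch below (not part of the proof; the values of <math>P</math>, <math>\varepsilon</math>, ''n'', and the codebook size are hypothetical choices) draws a random Gaussian codebook of variance <math>P-\varepsilon</math> and estimates, by Monte Carlo, how often a codeword's empirical power exceeds <math>P</math>:

```python
import numpy as np

# By the law of large numbers, the empirical power (1/n) sum_k x_k^2 of an
# i.i.d. Gaussian codeword with variance P - eps concentrates near P - eps,
# which is strictly below P, so power-constraint violations become rare as n
# grows. All numerical values here are hypothetical illustration choices.
rng = np.random.default_rng(0)
P, eps, n = 1.0, 0.05, 10_000
num_codewords = 1_000

codebook = rng.normal(0.0, np.sqrt(P - eps), size=(num_codewords, n))
empirical_power = (codebook ** 2).mean(axis=1)   # per-codeword (1/n) sum x_k^2

frac_violating = (empirical_power > P).mean()    # Monte Carlo estimate of P(U)
print(frac_violating)
```

With ''n'' this large, the violating fraction is essentially zero, matching the claim that <math>P(U) \to 0</math>.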
This allows us to bound <math>P^{(n)}_e</math>, the probability of error, as follows (for sufficiently large ''n''): : <math> \begin{align} P^{(n)}_e & \leq P(U) + P(V) + \sum_{j \neq i} P(E_j) \\ & \leq \varepsilon + \varepsilon + \sum_{j \neq i} 2^{-n(I(X;Y)-3\varepsilon)} \\ & \leq 2\varepsilon + (2^{nR}-1)2^{-n(I(X;Y)-3\varepsilon)} \\ & \leq 2\varepsilon + (2^{3n\varepsilon})2^{-n(I(X;Y)-R)} \\ & \leq 3\varepsilon \end{align} </math> Thus, for any rate <math>R < I(X;Y) - 3\varepsilon</math>, the probability of error <math>P^{(n)}_e</math> can be made at most <math>3\varepsilon</math> as ''n'' approaches infinity. Since <math>\varepsilon</math> is arbitrary, there exist codes of rate ''R'' arbitrarily close to the capacity derived earlier.
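The decisive step in the chain above is that the union-bound term <math>2^{nR} \cdot 2^{-n(I(X;Y)-3\varepsilon)}</math> decays exponentially in ''n'' whenever <math>R < I(X;Y) - 3\varepsilon</math>. A quick numeric check (the values of <math>I(X;Y)</math>, <math>\varepsilon</math>, and <math>R</math> are hypothetical):

```python
import math  # not strictly needed; exponentials use the ** operator

# With R below I(X;Y) - 3*eps, the term 2^{nR} * 2^{-n(I - 3 eps)} shrinks
# exponentially in n, so the total bound approaches 2*eps < 3*eps.
I_xy, eps, R = 0.5, 0.01, 0.4          # R = 0.4 < I - 3*eps = 0.47
for n in (100, 500, 1000):
    union_term = 2.0 ** (n * R) * 2.0 ** (-n * (I_xy - 3 * eps))
    bound = 2 * eps + union_term       # upper bound on P_e^{(n)}
    print(n, bound)
```

Even at ''n'' = 100 the union term is already below 0.01, and it vanishes rapidly for larger ''n''.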