sequence of vertices starting and ending at the same vertex, with no repetitions of vertices and edges allowed. The presence of short cycles in the Tanner graph significantly degrades the performance of iterative decoders, especially for short-length LDPC codes [34, 51, 70]. The length of the shortest cycle (or cycles) is known as the girth, which partially reflects the severity of the performance degradation.
In Section 2.5, we shall review some well-established strategies for constructing a Tanner graph that avoids harmful short cycles. As an example of a cycle, consider the Tanner graph shown in Figure 2.1. A cycle of length 6 is shown in thick blue lines. The blue-colored 1s of $\mathbf{H}$ in (2.1) correspond to the edges present in the cycle. There is no cycle of length 4 in the Tanner graph; thus the girth is 6.
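The absence of length-4 cycles can be checked directly from $\mathbf{H}$: a length-4 cycle exists if and only if two distinct columns of $\mathbf{H}$ have 1s in common in two or more rows. Below is a minimal sketch of this test, assuming $\mathbf{H}$ is a binary (0/1) integer NumPy array; the function name `has_length4_cycle` is ours.

```python
import numpy as np

def has_length4_cycle(H: np.ndarray) -> bool:
    """True iff the Tanner graph of H contains a cycle of length 4."""
    overlap = H.T @ H                  # overlap[j, j'] = number of CNs shared by VNs j and j'
    np.fill_diagonal(overlap, 0)       # every column overlaps itself; ignore the diagonal
    return bool((overlap >= 2).any())  # two CNs shared by two VNs form a 4-cycle
```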
2.3 Decoding of LDPC Codes
MAP decoding is performed block-wise. Although optimal, it is not feasible for LDPC codes of practical lengths because of the enormous number of codewords involved. In the SPA, the decoding is instead carried out in a bitwise manner, as shown below. For the bit $x_j$, the decoder output is given by
\[
\hat{x}_j = \arg\max_{x_j} \Pr\left(x_j \mid \mathbf{y},\ \text{all checks involving } x_j \text{ are satisfied}\right). \tag{2.5}
\]
As $x_j \in \{0,1\}$, the decision about $x_j$ can be made from the a posteriori LLR $L_j$ defined by
\[
L_j \triangleq \log \frac{\Pr\left(x_j = 0 \mid \mathbf{y},\ \text{all checks involving } x_j \text{ are satisfied}\right)}{\Pr\left(x_j = 1 \mid \mathbf{y},\ \text{all checks involving } x_j \text{ are satisfied}\right)}. \tag{2.6}
\]
In the above expression, $\Pr\left(x_j = u \mid \mathbf{y},\ \text{all checks involving } x_j \text{ are satisfied}\right)$ is the conditional probability that $x_j = u$, $u \in \{0,1\}$, given that (i) all the CNs $c_i$, $i \in N(j)$, are satisfied (i.e., all such $i$th components of the syndrome vector are zero) and (ii) the received sequence is $\mathbf{y}$.
Now, (2.5) can alternatively be expressed as
\[
\hat{x}_j = \frac{1 - \operatorname{sgn}(L_j)}{2}. \tag{2.7}
\]
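As a small illustration of (2.7), the hard decision can be taken directly from the sign of the LLRs. This NumPy sketch is ours; since (2.7) leaves the case $L_j = 0$ ambiguous ($\operatorname{sgn}(0) = 0$), ties are broken towards $\hat{x}_j = 0$ by assumption.

```python
import numpy as np

def hard_decision(L: np.ndarray) -> np.ndarray:
    """Hard decision of (2.7): L_j >= 0 -> x_j = 0, L_j < 0 -> x_j = 1."""
    return (L < 0).astype(int)  # ties L_j = 0 go to 0 (our convention)
```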
In the SPA, the $L_j$s are updated iteratively by exchanging messages between the VNs $v_j$ and the CNs $c_i$. The extrinsic messages from $v_j$ are initialized with the channel LLR (intrinsic message) $L_j^{\mathrm{ch}}$ for $x_j$, given by
\[
L_j^{\mathrm{ch}} = \log \frac{\Pr(x_j = 0 \mid y_j)}{\Pr(x_j = 1 \mid y_j)}. \tag{2.8}
\]
We consider the transmission of the codewords over an AWGN channel whose noise has mean zero and variance $\sigma_n^2$. The codeword sequence $\mathbf{x} = [x_1 \ldots x_N]^T$ is BPSK modulated to $\mathbf{t_x} = [t_{x_1} \ldots t_{x_N}]^T$ according to $t_{x_j} = 1 - 2x_j$. After the transmission of $\mathbf{t_x}$ over the AWGN channel, the received sequence is given by $\mathbf{y} = \mathbf{t_x} + \mathbf{n}$, where $\mathbf{n} = [n_1 \ldots n_N]^T$ and the $n_j$s are independent and identically distributed Gaussian random variables with mean zero and variance $\sigma_n^2$. Assuming the $x_j$s to be equally likely to take the values 0 and 1,
$L_j^{\mathrm{ch}}$ in (2.8) can now be expressed as
\[
L_j^{\mathrm{ch}} = \log \frac{\exp\left(-\frac{(y_j - 1)^2}{2\sigma_n^2}\right)}{\exp\left(-\frac{(y_j + 1)^2}{2\sigma_n^2}\right)} = \frac{2 y_j}{\sigma_n^2}. \tag{2.9}
\]
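As a numerical illustration of (2.8)-(2.9), the following minimal Python/NumPy sketch generates a BPSK-modulated sequence over AWGN and computes the channel LLRs; the variable names and parameter values are ours, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, sigma_n = 8, 0.8                     # block length and noise std (illustrative values)
x = rng.integers(0, 2, size=N)          # binary sequence standing in for a codeword
tx = 1 - 2 * x                          # BPSK mapping t_xj = 1 - 2 x_j: 0 -> +1, 1 -> -1
y = tx + sigma_n * rng.normal(size=N)   # received sequence y = tx + n
L_ch = 2 * y / sigma_n**2               # channel LLRs of (2.9)
```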
Figure 2.3: Diagrammatic representation of the message passing in the SPA. (a) CN processing: $c_i$ forms the message $U_{ij}$ towards $v_j$ from the incoming messages $V_{j'i}$ of the VNs $\{v_{j'} : j' \in M(i)\setminus\{j\}\}$. (b) VN processing: $v_j$ forms the message $V_{ji}$ towards $c_i$ from the channel LLR $L_j^{\mathrm{ch}}$ and the incoming messages $U_{i'j}$ of the CNs $\{c_{i'} : i' \in N(j)\setminus\{i\}\}$.
The SPA in the LLR domain is described in Algorithm 2.1. At the beginning, the VN→CN messages are initialized with the channel LLRs. The rest of the SPA can be presented in the following steps; a code sketch of the complete algorithm is given after step (iv).
Algorithm 2.1: Sum-Product Algorithm

input : A received sequence $\mathbf{y} = [y_1 \ldots y_N]^T$, the parity-check matrix $\mathbf{H}$ and the maximum number of iterations $I_m$
output: Estimated codeword $\hat{\mathbf{x}} = [\hat{x}_1 \ldots \hat{x}_N]^T$

Find the set of neighbors of each VN $j \in \{1, \ldots, N\}$: $N(j) = \{i : h_{ij} = 1\}$;
Find the set of neighbors of each CN $i \in \{1, \ldots, M\}$: $M(i) = \{j : h_{ij} = 1\}$;
for $j \leftarrow 1$ to $N$ do // initialization of the VN→CN messages
    $L_j^{\mathrm{ch}} = \frac{2 y_j}{\sigma_n^2}$;
    Set $V_{ji} = L_j^{\mathrm{ch}},\ \forall i \in N(j)$;
end
for $I \leftarrow 1$ to $I_m$ do // loop for iterations
    for $i \leftarrow 1$ to $M$ do // CN processing
        Compute $U_{ij}\ \forall j \in M(i)$: $U_{ij} = \log \dfrac{1 + \prod_{j' \in M(i)\setminus\{j\}} \tanh\left(\frac{V_{j'i}}{2}\right)}{1 - \prod_{j' \in M(i)\setminus\{j\}} \tanh\left(\frac{V_{j'i}}{2}\right)}$;
    end
    for $j \leftarrow 1$ to $N$ do // VN processing
        Compute $V_{ji}\ \forall i \in N(j)$: $V_{ji} = \sum_{i' \in N(j)\setminus\{i\}} U_{i'j} + L_j^{\mathrm{ch}}$;
    end
    for $j \leftarrow 1$ to $N$ do // computation of a posteriori LLR
        $L_j = \sum_{i' \in N(j)} U_{i'j} + L_j^{\mathrm{ch}}$;
    end
    for $j \leftarrow 1$ to $N$ do // hard decision
        $\hat{x}_j = \frac{1 - \operatorname{sgn}(L_j)}{2}$;
    end
    if $\mathbf{H}\hat{\mathbf{x}} = \mathbf{0}$ then // stopping criterion
        Stop;
    end
end

(i) CN processing: The message $U_{ij}$ sent from a CN $c_i$ towards its neighboring VN $v_j$ is defined as
\[
U_{ij} \triangleq \log \frac{\Pr\left(\text{check } c_i \text{ is satisfied} \mid x_j = 0, \mathbf{y}\right)}{\Pr\left(\text{check } c_i \text{ is satisfied} \mid x_j = 1, \mathbf{y}\right)}.
\]
In the above expression, $\Pr\left(\text{check } c_i \text{ is satisfied} \mid x_j = u, \mathbf{y}\right)$ is the conditional probability that the $i$th CN is satisfied (i.e., the $i$th element of the syndrome vector is zero) given that (i) the value of $x_j$ is $u$ and (ii) the received sequence is $\mathbf{y}$. With the assumption that all the VNs connected to $c_i$ send independent messages towards $c_i$, $U_{ij}$ can be computed as
\[
U_{ij} = \log \frac{1 + \prod_{j' \in M(i)\setminus\{j\}} \tanh\left(\frac{V_{j'i}}{2}\right)}{1 - \prod_{j' \in M(i)\setminus\{j\}} \tanh\left(\frac{V_{j'i}}{2}\right)}, \tag{2.10}
\]
where $M(i) = \{j : h_{ij} = 1\}$ is the set of the indices of the neighboring VNs connected to $c_i$ and $V_{j'i}$ is the message sent from $v_{j'}$ to $c_i$, as described in the next item. The graphical representation of the CN processing at $c_i$ is shown in Figure 2.3(a).
(ii) VN processing: The message $V_{ji}$ sent from a VN $v_j$ towards its neighboring CN $c_i$ is defined as
\[
V_{ji} \triangleq \log \frac{\Pr\left(x_j = 0 \mid \mathbf{y},\ \text{all checks involving } x_j \text{ except } c_i \text{ are satisfied}\right)}{\Pr\left(x_j = 1 \mid \mathbf{y},\ \text{all checks involving } x_j \text{ except } c_i \text{ are satisfied}\right)}.
\]
With the assumption that all the checks connected to $v_j$ are conditionally independent given the value of $x_j$, $V_{ji}$ can be computed as
\[
V_{ji} = \sum_{i' \in N(j)\setminus\{i\}} U_{i'j} + L_j^{\mathrm{ch}}, \tag{2.11}
\]
where $N(j)$ is the set of the indices of the neighboring CNs connected to $v_j$, i.e., $N(j) = \{i : h_{ij} = 1\}$. The graphical representation of the VN processing at $v_j$ is shown in Figure 2.3(b).
(iii) Computation of the a posteriori LLR: The a posteriori LLR $L_j$ for the VN $v_j$, as formulated in (2.6), can be computed as
\[
L_j = \sum_{i' \in N(j)} U_{i'j} + L_j^{\mathrm{ch}}. \tag{2.12}
\]
(iv) Hard decision and stopping criterion: Based on the sign of the a posteriori LLR, the hard decision is made according to (2.7). If a valid codeword is obtained, i.e., $\mathbf{H}\hat{\mathbf{x}} = \mathbf{0}$, the decoding process is stopped; otherwise, it continues until the maximum number of iterations is executed.
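To make the four steps above concrete, here is a minimal, unoptimized Python/NumPy sketch of Algorithm 2.1, assuming the BPSK/AWGN model of (2.9). The function name `spa_decode` and the numerical clipping guard in the CN update are our additions; everything else follows (2.10)-(2.12) and the stopping criterion directly.

```python
import numpy as np

def spa_decode(H: np.ndarray, y: np.ndarray, sigma_n: float, I_m: int = 50):
    """LLR-domain sum-product decoding following Algorithm 2.1.

    H       : M x N binary parity-check matrix (NumPy int array)
    y       : received sequence over AWGN with BPSK mapping t_xj = 1 - 2 x_j
    sigma_n : noise standard deviation
    I_m     : maximum number of iterations
    Returns the hard-decision estimate x_hat and the a posteriori LLRs L.
    """
    M, N = H.shape
    L_ch = 2.0 * y / sigma_n**2          # channel LLRs, (2.9)
    V = H * L_ch                         # VN->CN messages V_ji, initialized to L_ch
    U = np.zeros((M, N))                 # CN->VN messages U_ij

    for _ in range(I_m):
        # CN processing, (2.10): combine tanh(V_j'i / 2) over all other neighbors.
        for i in range(M):
            nbrs = np.flatnonzero(H[i])
            t = np.tanh(V[i, nbrs] / 2.0)
            for k, j in enumerate(nbrs):
                p = np.prod(np.delete(t, k))           # product over M(i) \ {j}
                p = np.clip(p, -1 + 1e-12, 1 - 1e-12)  # guard against log(0) (our addition)
                U[i, j] = np.log((1 + p) / (1 - p))

        # A posteriori LLR, (2.12), then VN processing, (2.11), by exclusion.
        L = L_ch + U.sum(axis=0)
        for j in range(N):
            for i in np.flatnonzero(H[:, j]):
                V[i, j] = L[j] - U[i, j]               # sum over N(j) \ {i} plus L_ch

        # Hard decision, (2.7), and stopping criterion H x_hat = 0.
        x_hat = (L < 0).astype(int)                    # ties L_j = 0 go to 0
        if not ((H @ x_hat) % 2).any():
            break
    return x_hat, L
```

Since the all-zero vector is a codeword of every linear code, a quick sanity check is to decode a $\mathbf{y}$ generated from $\mathbf{x} = \mathbf{0}$ at moderate noise and verify that $\hat{\mathbf{x}} = \mathbf{0}$ is returned.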