## Linear Codes February 22, 2006

In *nearest neighbor decoding*, a received word is decoded as the codeword at closest Hamming distance.
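A minimal sketch of nearest neighbor decoding, using the length-3 repetition code as an illustrative example (the code and helper names here are my own, not from the notes):

```python
# Hamming distance between two binary words of equal length.
def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

# Nearest neighbor decoding: pick the codeword at minimum Hamming
# distance from the received word.
def nearest_neighbor(received, codewords):
    return min(codewords, key=lambda c: hamming(received, c))

# Example: the length-3 repetition code {000, 111}.
code = [(0, 0, 0), (1, 1, 1)]
print(nearest_neighbor((1, 0, 1), code))  # (1, 1, 1)
```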

**[Theorem]**

All linear codes contain the zero word.

**[Theorem]**

For a linear code using nearest neighbor decoding, whether or not a received word *r* is uniquely and correctly decodable depends only on the error vector, not on the codeword that was sent.

**The Generator Matrix**

A generator matrix *G* for an [*n*, *k*] code *C* is any *k* x *n* matrix whose rows form a basis for *C*.
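Since the rows of *G* form a basis, the code is exactly the set of all linear combinations *mG* of the *k*-bit messages *m*. A small sketch, using an illustrative [4, 2] generator matrix of my own choosing:

```python
from itertools import product

# An example generator matrix for a small [4, 2] binary code.
G = [[1, 0, 1, 1],
     [0, 1, 0, 1]]

def encode(m, G):
    # Codeword = mG over GF(2): each coordinate is a mod-2 dot product.
    n = len(G[0])
    return tuple(sum(m[i] * G[i][j] for i in range(len(G))) % 2
                 for j in range(n))

# The code C is the set of encodings of all 2^k messages.
C = {encode(m, G) for m in product([0, 1], repeat=len(G))}
print(sorted(C))  # [(0,0,0,0), (0,1,0,1), (1,0,1,1), (1,1,1,0)]
```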

**[Theorem]**

If *G* = [*I*_{k} | *A*] is a generator matrix for the [*n*, *k*] code *C* in standard form, then *H* = [-*A*^{T} | *I*_{n-k}] is a parity check matrix for *C*.
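This construction can be checked directly. Over GF(2) the sign disappears (−*A*^{T} = *A*^{T}), so *H* is just *A* transposed with an identity block appended; the sanity check below verifies that every row of *G* is orthogonal to every row of *H* (the [4, 2] matrix is an illustrative example of my own):

```python
# Given G = [I_k | A] in standard form, build H = [-A^T | I_{n-k}].
# Over GF(2), -A^T = A^T, so no sign change is needed here.
def parity_check_from_standard(G):
    k, n = len(G), len(G[0])
    A = [row[k:] for row in G]                                 # the k x (n-k) block A
    At = [[A[i][j] for i in range(k)] for j in range(n - k)]   # A transposed
    I = [[int(i == j) for j in range(n - k)] for i in range(n - k)]
    return [At[i] + I[i] for i in range(n - k)]

G = [[1, 0, 1, 1],
     [0, 1, 0, 1]]           # standard form [I_2 | A]
H = parity_check_from_standard(G)

# Sanity check: G H^T = 0, i.e. each row of G is orthogonal to each row of H.
assert all(sum(g[j] * h[j] for j in range(4)) % 2 == 0 for g in G for h in H)
print(H)  # [[1, 0, 1, 0], [1, 1, 0, 1]]
```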

**Syndrome Decoding**

Let *H* be a parity check matrix for a linear code, and let *v* be any word. Then the syndrome of *v*, *syn*(*v*) = *vH*^{T}.

For a codeword *x*, *syn*(*x*) = **0**.

**[Theorem]**

If *syn*(*u*) = *syn*(*v*), then *u* and *v* belong to the same coset.

**Syndrome decoding**

The idea behind syndrome decoding is that, given a received word, you can compute its syndrome to find which coset it belongs to (provided you have stored the syndromes of the coset leaders). The benefit is the savings in storage, since you don’t have to keep the whole Slepian standard array.

(i) Calculate the syndrome *rH*^{T} of the received word *r*.

(ii) Scan the stored list of coset leaders to find the coset leader *e* with the same syndrome.

(iii) Decode to the codeword *c* = *r* – *e*.
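The three steps above can be sketched in code. The parity check matrix, code, and coset-leader table below are an illustrative example of my own (a small [4, 2] code, so there are 2^{n−k} = 4 cosets, each with a minimum-weight leader):

```python
# Syndrome decoding sketch for a small [4, 2] binary code.
H = [[1, 0, 1, 0],
     [1, 1, 0, 1]]

def syn(v, H):
    # syn(v) = v H^T over GF(2)
    return tuple(sum(v[j] * row[j] for j in range(len(v))) % 2 for row in H)

# Precomputed table: syndrome -> minimum-weight coset leader.
leaders = {
    (0, 0): (0, 0, 0, 0),
    (1, 1): (1, 0, 0, 0),
    (0, 1): (0, 1, 0, 0),
    (1, 0): (0, 0, 1, 0),
}

def decode(r):
    e = leaders[syn(r, H)]                            # (i)-(ii): find the coset leader
    return tuple((a + b) % 2 for a, b in zip(r, e))   # (iii): c = r - e

# A single bit flip in the codeword 1011 is corrected:
print(decode((1, 0, 0, 1)))  # (1, 0, 1, 1)
```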

**Puncturing Codes**

Deleting the same coordinate *i* in each codeword of a code is termed “puncturing” the code. The resulting code is still linear.
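A quick sketch, puncturing an illustrative [4, 2] code of my own in its last coordinate (note the punctured code can have fewer codewords if two codewords differed only in the deleted coordinate; here all four survive):

```python
# Puncturing: delete coordinate i from every codeword.
def puncture(code, i):
    return {c[:i] + c[i + 1:] for c in code}

C = {(0, 0, 0, 0), (1, 0, 1, 1), (0, 1, 0, 1), (1, 1, 1, 0)}
P = puncture(C, 3)   # drop the last coordinate

# The result is still closed under mod-2 addition, hence still linear.
print(sorted(P))  # [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 1)]
```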

**Decoding and Shannon’s Theorem**

Assume that *c* ∈ *Z*_{2}^{n} is sent and *y* ∈ *Z*_{2}^{n} is received and decoded as *ĉ* ∈ *Z*_{2}^{n}.

From Bayes’s rule, *Pr*(*c* | *y*) = *Pr*(*y* | *c*) *Pr*(*c*) / *Pr*(*y*).

**Maximum a posteriori (MAP) decoding**

Choose *ĉ* = *c* for the codeword *c* with *Pr*(*c* | *y*) maximum.

**Maximum Likelihood decoding**

Pick *ĉ* = *c* for the codeword *c* with *Pr*(*y* | *c*) maximum.
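On a binary symmetric channel with crossover probability *p* < 1/2, *Pr*(*y* | *c*) = *p*^{d}(1 − *p*)^{n−d} where *d* is the Hamming distance between *y* and *c*, so maximum likelihood decoding reduces to nearest neighbor decoding. A sketch under that assumption, using the repetition code as an illustrative example:

```python
# Likelihood of receiving y given that c was sent over a BSC with
# crossover probability p: Pr(y | c) = p^d * (1-p)^(n-d).
def likelihood(y, c, p):
    d = sum(a != b for a, b in zip(y, c))
    return p ** d * (1 - p) ** (len(y) - d)

code = [(0, 0, 0), (1, 1, 1)]   # length-3 repetition code
y = (1, 1, 0)
best = max(code, key=lambda c: likelihood(y, c, 0.1))
print(best)  # (1, 1, 1): for p < 1/2 the nearest codeword is the most likely
```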
