Abstract
In 1995, Best et al. published a formula for the exact bit error probability for Viterbi decoding of the rate R=1/2, memory m=1 (2-state) convolutional encoder with generator matrix G(D)=(1 1+D) when used to communicate over the binary symmetric channel. Their formula was later extended to the rate R=1/2, memory m=2 (4-state) convolutional encoder with generator matrix G(D)=(1+D^2 1+D+D^2) by Lentmaier et al.
In this paper, a different approach to derive the exact bit error probability is described. A general recurrent matrix equation, connecting the average information weights at the current and previous states of a trellis section of the Viterbi decoder, is derived and solved. The general solution of this matrix equation yields a closed-form expression for the exact bit error probability. As special cases, the expressions obtained by Best et al. for the 2-state encoder and by Lentmaier et al. for the 4-state encoder are recovered. The closed-form expression derived in this paper is evaluated for various encoder realizations, including rate R=1/2 and R=2/3 encoders with as many as 16 states.
Moreover, it is shown that it is straightforward to extend the approach to communication over the quantized additive white Gaussian noise channel.
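The setting described above can be illustrated by simulation. The following minimal Python sketch (not the paper's closed-form method, and not part of the original record) estimates the bit error probability of Viterbi decoding for the 2-state, rate R=1/2 encoder G(D)=(1 1+D) over a binary symmetric channel; the crossover probabilities, block length, and number of blocks are illustrative assumptions.

```python
# Illustrative Monte Carlo sketch: Viterbi decoding of the 2-state, rate R = 1/2
# encoder G(D) = (1, 1+D) over a BSC with crossover probability p.
# This is NOT the closed-form derivation of the paper; all parameters below are
# assumptions chosen only to make the example runnable.
import random

def encode(u):
    """Rate-1/2 encoding with G(D) = (1, 1+D): v1 = u_t, v2 = u_t XOR u_{t-1}."""
    v, prev = [], 0
    for bit in u:
        v.append((bit, bit ^ prev))
        prev = bit
    return v

def bsc(v, p):
    """Flip each code bit independently with probability p."""
    return [(b1 ^ (random.random() < p), b2 ^ (random.random() < p)) for b1, b2 in v]

def viterbi(r):
    """Viterbi decoding over the 2-state trellis (state = previous information bit)."""
    INF = float("inf")
    metric = [0, INF]            # start in the all-zero state
    paths = [[], []]
    for r1, r2 in r:
        new_metric, new_paths = [INF, INF], [None, None]
        for state in (0, 1):
            if metric[state] == INF:
                continue
            for u in (0, 1):     # hypothesized information bit; next state is u
                v1, v2 = u, u ^ state
                d = (v1 != r1) + (v2 != r2)      # Hamming branch metric
                if metric[state] + d < new_metric[u]:
                    new_metric[u] = metric[state] + d
                    new_paths[u] = paths[state] + [u]
        metric, paths = new_metric, new_paths
    return paths[0] if metric[0] <= metric[1] else paths[1]

def estimate_ber(p, n_blocks=200, block_len=200):
    """Estimate the bit error probability by averaging over random information blocks."""
    errors = total = 0
    for _ in range(n_blocks):
        u = [random.randint(0, 1) for _ in range(block_len)]
        u_hat = viterbi(bsc(encode(u), p))
        errors += sum(a != b for a, b in zip(u, u_hat))
        total += block_len
    return errors / total

if __name__ == "__main__":
    for p in (0.02, 0.05, 0.10):
        print(f"p = {p:.2f}: estimated P_b approx. {estimate_ber(p):.2e}")
```

Such a simulation only approximates the bit error probability; the contribution of the paper is that the matrix recursion on the average information weights gives this quantity exactly, in closed form.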
| Original language | English |
| --- | --- |
| Pages (from-to) | 4635-4644 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 58 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - 2012 |
Subject classification (UKÄ)
- Electrical Engineering, Electronic Engineering, Information Engineering
Free keywords
- additive white Gaussian noise channel
- binary symmetric channel
- bit error probability
- convolutional code
- convolutional encoder
- exact bit error probability
- Viterbi decoding