Convolutional Decoding Using MATLAB Functions - Viterbi Decoder

I'm trying to perform convolutional decoding using built-in MATLAB functions, specifically for a (2,1,7) code.
I want to implement a Viterbi decoder for a given message, where the decoder has constraint length K = 7 and rate 1/2, so it produces two coded bits per input bit. The decoder should make its decisions by the minimum Hamming distance method (i.e., maximum-likelihood decoding).
How can I implement this in MATLAB?
I've seen that MATLAB already provides a built-in function for this:
decoded=vitdec(coded,trellis,tblen,'trunc','hard')
However, I don't understand whether this function uses the minimum Hamming distance to make the maximum-likelihood Viterbi decision. Also, in my case the decoder has K = 7 and two output bits per input bit, so what should the inputs to vitdec be? Thanks a lot for any clarifications.
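
With the 'hard' option, vitdec uses Hamming-distance branch metrics, so it does perform minimum-Hamming-distance decoding, which is maximum-likelihood for a binary symmetric channel. Below is a minimal sketch of how the call could look for a K = 7, rate-1/2 code, assuming the commonly used [171 133] octal generator polynomials (substitute your own generators if they differ):

% Build the trellis for K = 7, rate 1/2 (generators assumed to be 171 and 133 octal)
trellis = poly2trellis(7, [171 133]);

% Example message and encoding
msg   = randi([0 1], 100, 1);        % 100 random information bits
coded = convenc(msg, trellis);       % 200 coded bits (two output bits per input bit)

% Hard-decision Viterbi decoding; a traceback depth of about 5*K is a common choice
tblen   = 5*7;
decoded = vitdec(coded, trellis, tblen, 'trunc', 'hard');

isequal(decoded, msg)                % true here, since no channel errors were added

In practice you would pass your received hard-decision bits in place of coded; with noise present, the decoder picks the trellis path whose re-encoded output is closest in Hamming distance to what was received.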

