Article
 

High-Speed Low-Power Viterbi Decoder Design for TCM Decoders

https://ir.library.oregonstate.edu/concern/articles/kh04dq890

Descriptions

Creator
  • He, J., Liu, H., Wang, Z., Huang, X., & Zhang, K.
Abstract
  • High-speed, low-power design of Viterbi decoders for trellis coded modulation (TCM) systems is presented in this paper. It is well known that the Viterbi decoder (VD) is the dominant module determining the overall power consumption of TCM decoders. We propose a pre-computation architecture incorporating the T-algorithm for the VD, which can effectively reduce the power consumption without significantly degrading the decoding speed. A general solution for deriving the optimal number of pre-computation steps is also given in the paper. Implementation results for a VD of a rate-3/4 convolutional code used in a TCM system show that, compared with the full-trellis VD, the pre-computation architecture reduces the power consumption by as much as 70% with no performance loss and negligible degradation in clock speed.
  • This is the author's peer-reviewed final manuscript, as accepted by the publisher. The published article is copyrighted by IEEE and can be found at: http://ieeexplore.ieee.org/xpl/tocresult.jsp?isnumber=6257480. ©2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
  • Keywords: Trellis coded modulation (TCM), Viterbi decoder, VLSI
Resource Type
  • Article
DOI
  • 10.1109/TVLSI.2011.2111392
Date Available
Date Issued
  • 2012
Citation
  • He, J., Liu, H., Wang, Z., Huang, X., & Zhang, K. (2012). High-speed low-power Viterbi decoder design for TCM decoders. IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 20(4), 755-759. doi: 10.1109/TVLSI.2011.2111392
Journal Title
  • IEEE Transactions on Very Large Scale Integration (VLSI) Systems
Journal Volume
  • 20
Journal Issue/Number
  • 4
Rights Statement
Publisher
  • IEEE
Peer Reviewed
Language
Replaces
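
The abstract above relies on the T-algorithm, a pruning rule that, at each trellis stage, discards survivor paths whose path metric exceeds the best metric by more than a threshold T, so the work for those paths can be skipped. As background only, below is a minimal software sketch of that pruning idea, assuming a generic hard-decision rate-1/2, constraint-length-3 convolutional code; the generator polynomials, the threshold value, and every identifier are illustrative assumptions and are not taken from the paper, which targets a rate-3/4 code and a hardware pre-computation architecture.

```python
# Illustrative sketch of T-algorithm pruning in a Viterbi decoder.
# This is NOT the paper's pre-computation architecture; it only shows the
# pruning idea the paper builds on: at every trellis stage, survivor paths
# whose metric exceeds the best metric by more than a threshold T are purged,
# so their add-compare-select work can be skipped at the next stage.
# The rate-1/2, constraint-length-3 code (generators 7, 5 octal) and the
# hard-decision Hamming branch metric are illustrative choices.

K = 3                      # constraint length
N_STATES = 1 << (K - 1)    # 4 trellis states
G = (0b111, 0b101)         # generator polynomials (7, 5 in octal)

def branch_bits(state, bit):
    """Encoder output bits for input `bit` leaving trellis state `state`."""
    reg = (bit << (K - 1)) | state
    return tuple(bin(reg & g).count("1") & 1 for g in G)

def next_state(state, bit):
    return ((bit << (K - 1)) | state) >> 1

def viterbi_t_algorithm(received, T=2):
    """Hard-decision Viterbi decoding with T-algorithm path-metric pruning."""
    INF = float("inf")
    metrics = {0: 0}                      # only the all-zero state is alive
    history = []                          # per-stage predecessor/bit tables

    for rx in received:
        new_metrics, back = {}, {}
        for s, m in metrics.items():      # extend only surviving states
            for bit in (0, 1):
                ns = next_state(s, bit)
                bm = sum(a != b for a, b in zip(branch_bits(s, bit), rx))
                if m + bm < new_metrics.get(ns, INF):
                    new_metrics[ns] = m + bm
                    back[ns] = (s, bit)
        best = min(new_metrics.values())
        # T-algorithm: purge states whose metric exceeds best + T.
        metrics = {s: m for s, m in new_metrics.items() if m <= best + T}
        history.append({s: back[s] for s in metrics})

    # Trace back from the best surviving state.
    state = min(metrics, key=metrics.get)
    bits = []
    for back in reversed(history):
        state, bit = back[state]
        bits.append(bit)
    return bits[::-1]

if __name__ == "__main__":
    # Encode a short message, then decode it with the pruned Viterbi search.
    message = [1, 0, 1, 1, 0, 0, 1, 0]
    state, coded = 0, []
    for b in message:
        coded.append(branch_bits(state, b))
        state = next_state(state, b)
    print(viterbi_t_algorithm(coded, T=2))   # recovers `message`
```

In a VLSI decoder the same pruning allows the add-compare-select logic for purged states to be disabled, which is broadly where the power savings discussed in the abstract come from; per the abstract, the paper's pre-computation approach obtains those savings while keeping the clock-speed degradation negligible.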
