
Trellis and Turbo Coding: Iterative and Graph-Based Error Control Coding, by Schlegel, Christian B. (eBook)

  • Publication date: 19.08.2015
  • Publisher: Wiley-IEEE Press
eBook (ePUB)
€111.99
incl. statutory VAT
Available immediately via download

Available online

Trellis and Turbo Coding

This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original.

  • Convolutional, turbo, and low-density parity-check (LDPC) coding and polar codes in a unified framework
  • Advanced research-related developments such as spatial coupling
  • A focus on algorithmic and implementation aspects of error control coding


    Format: ePUB
    Copy protection: Adobe DRM
    Pages: 528
    Publication date: 19.08.2015
    Language: English
    ISBN: 9781119106326
    Publisher: Wiley-IEEE Press
    Size: 18541 kBytes

Trellis and Turbo Coding

Chapter 1

1.1 Modern Digital Communications

With the advent of high-speed logic circuits and very large scale integration (VLSI), data processing and storage equipment has inexorably moved towards employing digital techniques. In digital systems, data is encoded into strings of zeros and ones, corresponding to the on and off states of semiconductor switches. This has brought about fundamental changes in how information is processed. While real-world data is primarily in "analog form" of one type or another, the revolution in digital processing means that this analog information must be encoded into a digital representation, e.g., into a string of ones and zeros. The conversions from analog to digital and back have become ubiquitous processes. Examples include the digital encoding of speech, the encoding and rendering of pictures and video, and the large variety of ways of capturing and representing data encountered in our modern internet-based lifestyles.
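The analog-to-digital conversion described above can be sketched in a few lines. The following is a minimal, illustrative example, not taken from the book: a uniform quantizer maps each sample of a waveform in [-1, 1) to a fixed-width bit string; the 3-bit width and sine-wave input are assumptions chosen for brevity.

```python
import math

def quantize(x, bits):
    """Map a sample x in [-1, 1) to an unsigned binary code of the given width."""
    levels = 2 ** bits
    idx = int((x + 1.0) / 2.0 * levels)  # scale to [0, levels)
    idx = max(0, min(levels - 1, idx))   # clamp to the valid code range
    return format(idx, f"0{bits}b")      # zero-padded bit string, e.g. '101'

# Sample one period of a sine wave at 8 points and encode each sample at 3 bits.
samples = [math.sin(2 * math.pi * k / 8) for k in range(8)]
codes = [quantize(s, 3) for s in samples]
```

Real converters add anti-alias filtering and non-uniform quantization (e.g., for speech), but the core step is the same: continuous values become short strings of ones and zeros.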

The migration from analog communications in the first half of the 20th century to the now ubiquitous digital forms of communication was enabled primarily by fast-paced advances in high-density device integration. This has been the engine behind much of the technological progress of the last half century, initiated by the creation of the first integrated circuit (IC) by Kilby at Texas Instruments in 1958. Following Moore's informal law, device sizes, primarily of CMOS (complementary metal-oxide semiconductor) transistors, shrink by a factor of two every two years, and computational power doubles accordingly. An impression of this exponential growth in computing capability can be gained from Figure 1.1, which shows the number of transistors integrated in a single circuit and the minimum device size for successive fabrication processes, known as implementation nodes.
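The doubling law just described is easy to state as a formula. The sketch below assumes a strict two-year doubling from a 1971 baseline of roughly 2,300 transistors (the Intel 4004); the numbers are illustrative only, since real scaling has varied around this trend.

```python
def transistors(year, base_year=1971, base_count=2300, period=2.0):
    """Predicted transistor count under a strict two-year doubling law.

    base_year / base_count: assumed starting point (Intel 4004, ~2,300
    transistors in 1971); period: years per doubling.
    """
    return base_count * 2 ** ((year - base_year) / period)

# Forty years of doubling every two years gives 2^20 (about a million-fold)
# growth over the 1971 baseline.
growth_2011 = transistors(2011) / transistors(1971)
```

The same exponent explains why Figure 1.1 is drawn on a logarithmic scale: a constant doubling period appears as a straight line.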

While straightforward miniaturization of CMOS devices is becoming increasingly difficult, transistor designers have been very creative in modifying their designs to stay on the Moore trajectory. As of 2015 we see the introduction of three-dimensional transistor structures such as FinFETs, double-gate FETs, and tunnel FETs, and it is expected that carbon nanotube devices may continue miniaturization well into the sub-10 nm range. In any case, the future for highly complex computational devices is bright.

Figure 1.1: Moore's law is driving progress in electronic devices. Top left: a basic CMOS switching structure. Bottom left: Moore observed his "doubling law" in 1965 and predicted that it would continue "at least another 10 years."

One such computational challenge is data communications: in particular data integrity, as discussed in this book. The migration from analog to digital information processing has opened the door for many sophisticated algorithmic methods. Digital information is treated differently in communications than analog information. Signal estimation becomes signal detection; that is, a communications receiver need not look for an analog signal and make a "best" estimate, it only needs to make a decision between a finite number of discrete signals, say a one or a zero in the most basic case. Digital signals are more reliable in a noisy communications environment; they can usually be detected perfectly as long as the noise levels are below a certain threshold. This allows us to restore digital data and, through error-correcting techniques, correct errors made during transmission. Digital data can easily be encoded in such a way as to introduce dependency among a large number of symbols, thus enabling a receiver to make a more accurate detection of the symbols. This is the essence of error control coding.
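The idea of introducing dependency among symbols can be illustrated with the simplest possible error-control scheme. The following sketch, an illustrative example rather than anything specific to this book, uses a rate-1/3 repetition code: each bit is sent three times, and a majority vote at the receiver corrects any single bit flip per block.

```python
def encode(bits):
    """Repetition encoder: transmit each information bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote decoder: each 3-bit block yields one decided bit."""
    blocks = [received[i:i + 3] for i in range(0, len(received), 3)]
    return [1 if sum(blk) >= 2 else 0 for blk in blocks]

msg = [1, 0, 1]
tx = encode(msg)          # [1,1,1, 0,0,0, 1,1,1] on the channel
tx[1] ^= 1                # the channel flips one transmitted bit
rx = decode(tx)           # majority vote recovers the original message
```

Repetition coding is wasteful (it triples the transmitted data for modest protection); the convolutional, turbo, and LDPC codes developed in later chapters spread dependency across many symbols far more efficiently, which is exactly the point of this book.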

Finally, there are also strong theoretical reasons behind the migration to digital processing. Nyquist's sampling theorem, discussed in S
