By Andrew J. Viterbi
Dr. Andrew Viterbi is a co-founder of QUALCOMM Incorporated. He has spent equal portions of his career in industry, having also co-founded a previous company, and in academia as Professor in the Schools of Engineering, first at UCLA and then at UCSD, where he is now Professor Emeritus.
His principal research contribution, the Viterbi Algorithm, is used in most digital cellular phones and digital satellite receivers, as well as in such diverse fields as magnetic recording, speech recognition and DNA sequence analysis. In recent years he has concentrated his efforts on establishing CDMA as the multiple access technology of choice for cellular telephony and wireless data communication.
The origins of multiple access date back to Patent No. 7777, awarded in 1900 to Marconi for the "Tuned Circuit," which was the enabling technology for both Frequency Division Multiplexing (FDM) and Frequency Division Multiple Access (FDMA). (FDM refers to transmission of multiple sources from a single location by modulating each on a separate carrier sufficiently separated from the others, while in FDMA the sources and their respective modulated carriers emanate from different transmitters, generally not co-located.) FDM and FDMA are the only multiplexing and multiple access techniques that can be used with both analog and digital transmission.
For digital sources, two alternative technologies have evolved for multiplexing and multiple access: time division (TDM and TDMA) and code division (CDM and CDMA). With the beginnings of the computer industry in the 1950's, TDM evolved naturally since it is a way to multiplex several parallel data streams generated in a single location into one serial data stream. Thus time division multiplexing is synonymous with parallel-to-serial conversion. TDMA, on the other hand, was used beginning in the 1960's for geosynchronous satellite networks of small numbers of large-antenna earth stations.
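The equivalence of time division multiplexing and parallel-to-serial conversion can be illustrated with a minimal sketch (not from the article; the stream contents are invented for illustration): several parallel streams are interleaved one symbol per time slot into a single serial stream, and the receiver de-interleaves by slot position.

```python
# Hypothetical sketch: TDM as parallel-to-serial conversion.
# Three parallel data streams generated at one location.
streams = [list("AAAA"), list("BBBB"), list("CCCC")]

# Interleave one symbol from each stream per time slot into one serial stream.
serial = [sym for slot in zip(*streams) for sym in slot]
# serial == ['A', 'B', 'C', 'A', 'B', 'C', 'A', 'B', 'C', 'A', 'B', 'C']

# The receiver de-multiplexes simply by taking every n-th symbol (n = 3 here).
recovered = [serial[i::len(streams)] for i in range(len(streams))]
```

The de-multiplexer needs only slot timing, not any per-user code, which is why TDM requires all sources to share a common clock at a single location.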
Code division multiple access (CDMA) has a far different pedigree, also dating back to the 1950's. As its name implies, users' signals are isolated not by separate time or frequency slots, which are occupied in common by all users, but rather by unique underlying codes, which when decoded restore the original desired signal while (ideally) totally removing the effect of the other users' coded signals. For this ideal case the codes must be time-synchronized and orthogonal, meaning that any two users' codes must differ in half their symbols and agree in the other half. This synchronization in time is easily achieved for code division multiplexing, where all sources destined for all users are transmitted from the same location, such as a base station. For multiple access, on the other hand, time synchronization is generally not practical since users are separated by distances which change with motion. Additionally, multiple paths may produce different time shifted replicas of each user's transmitted signal and code. Thus for CDMA, users' codes are generally chosen to be non-repetitive over a very long period, which does not guarantee orthogonality over the shorter period of each user's transmission, but does ensure a small effect on the demodulators of other users.
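The synchronized, orthogonal case described above can be sketched in a few lines (a minimal illustration, not from the article; the length-4 codes are invented for the example). Two ±1 codes that agree in half their symbols and differ in the other half have zero correlation, so despreading one user's signal removes the other user entirely.

```python
# Two orthogonal +/-1 codes: they agree in two chips and differ in two,
# so their dot product is exactly zero.
code_a = [+1, +1, +1, +1]
code_b = [+1, -1, +1, -1]

def spread(bit, code):
    """Multiply one data bit (+1 or -1) by every symbol of the user's code."""
    return [bit * c for c in code]

def despread(signal, code):
    """Correlate the received signal with a code; decide the bit by sign."""
    corr = sum(s * c for s, c in zip(signal, code))
    return +1 if corr > 0 else -1

bit_a, bit_b = +1, -1
# With perfect time synchronization the channel simply adds the coded signals.
received = [a + b for a, b in zip(spread(bit_a, code_a), spread(bit_b, code_b))]

# Orthogonality (ideally) removes the other user's effect completely.
print(despread(received, code_a))  # 1
print(despread(received, code_b))  # -1
```

Without synchronization the codes slide past one another and orthogonality is lost, which is why, as the article notes, multiple-access systems instead use long non-repetitive codes with small (but nonzero) cross-interference.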
An important side effect of code division is that each user's transmitted bandwidth is greatly enlarged by making the coded signal's symbol rate, or clock, run much faster than the digital data rate of the source. For example, if the data rate is 10K bits/sec, the code clock symbol rate may be 1M symbols/sec, or 100 times as fast. The result is an occupied bandwidth approximately equal to the code symbol rate; hence the term "spread spectrum" is often used interchangeably with CDMA. This, in fact, better describes the origins of CDMA. As early as World War II, but with greater intensity and sophistication beginning in the 1950's, spread spectrum was employed in military communications to protect against hostile interception and interference or jamming. If the enemy does not know the communicator's code, the latter's signal will appear merely as noise. More significantly, if the enemy tries to jam the transmission with any form of radio signal, the intended friendly receiver's demodulator, in the process of decoding the desired signal, will transform the hostile signal into a spread spectrum form approximating wideband noise. The effect is to reduce the hostile jammer's effectiveness by a factor known as the "processing gain" or "spreading factor," which is the ratio of the code rate to the original source's bit rate (100 for the example just given). Essentially, spread spectrum or CDMA is the "best" signaling modulation for even the "worst" form of jamming signal. (This is sometimes called the mini-max solution of a game between communicator and jammer.)
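The processing-gain arithmetic above can be made concrete with a small sketch (an idealized illustration, not from the article: a simple alternating ±1 chip sequence stands in for a long pseudorandom code, and the jammer is modeled as a constant amplitude added to every chip).

```python
data_rate = 10_000        # source bits/sec, as in the article's example
chip_rate = 1_000_000     # code symbol (chip) rate
gain = chip_rate // data_rate   # processing gain = 100 (about 20 dB)

# Alternating +/-1 chips as a stand-in for a long pseudorandom code.
code = [+1 if i % 2 == 0 else -1 for i in range(gain)]
bit = +1
jammer = 5.0              # constant jamming amplitude added to every chip

received = [bit * c + jammer for c in code]

# Despreading: correlating with the code adds the desired chips coherently
# (contributing gain * bit), while the jammer is multiplied chip-by-chip by
# the code and averages out -- it is spread into noise-like residue whose
# power is suppressed by roughly the processing gain.
corr = sum(r * c for r, c in zip(received, code))
decision = +1 if corr > 0 else -1
print(gain, corr, decision)   # 100 100.0 1
```

With this idealized code the jammer cancels exactly; a real pseudorandom code leaves a small residue, but one reduced in power by roughly the factor of 100 in the article's example.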
This historical military application took on added importance with the proliferation of military geosynchronous communication satellites in the 1970's and 1980's, which are particularly vulnerable to jamming from almost anywhere. The first commercial applications of CDMA were also in transmissions to communication satellites: as the geosynchronous orbit space became more crowded and earth antennas became much smaller, with consequently wider beamwidths, transmissions to and from satellites began to interfere more severely with one another. Hence the interference suppression properties of CDMA made it the multiple access technology of choice.
Terrestrial mobile cellular telephony became the overwhelmingly pervasive multiple access application of the 1990's with approximately 400 million subscribers today and possibly over a billion by the end of the decade. The industry moved from analog modulation in the 1980's to digital modulation in the 1990's and now employs voice compression and advanced modulation and coding techniques to serve more subscribers per base station in the allotted frequency spectrum. Though the early impetus from the European cellular telephony standard of the 1980's was toward a hybrid TDMA/FDMA approach known as GSM, the establishment of a CDMA standard in 1993 has helped make CDMA the most rapidly growing technology, already serving over a quarter of the digital cellular population. Its technical advantages are particularly important as the number of users served by a base station increases and hence suppressing the effects of multipath, avoiding interference between users, and performing soft handover between base stations become ever more critical. And as bandwidth efficiency needed for high-speed access to the Web becomes essential in third-generation wireless systems, CDMA is expected to become the global standard.
©1995 - 2017, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.