
fredric j harris

Professor harris is at the University of California San Diego, where he teaches and conducts research on digital signal processing and communication systems. He holds 40 patents on digital receiver and DSP technology and lectures throughout the world on DSP applications. He consults for organizations requiring high-performance, cost-effective DSP solutions. He has written some 260 journal and conference papers, the best known being his 1978 paper "On the Use of Windows for Harmonic Analysis with the Discrete Fourier Transform". He is the author of the book Multirate Signal Processing for Communication Systems and has contributed to several other DSP books. His special areas include polyphase filter banks, physical layer modem design, synchronizing digital modems, and spectral estimation. He was Technical Chair of the 1990 and General Chair of the 1991 Asilomar Conference on Signals, Systems, and Computers, and was Technical Chair of the 2003 Software Defined Radio Conference, the 2006 Wireless Personal Multimedia Conference, the DSP-2009 and DSP-2013 conferences, and the SDR-WinnComm 2015 Conference. He became a Fellow of the IEEE in 2003, cited for contributions of DSP to communications systems. In 2006 he received the Software Defined Radio Forum's "Industry Achievement Award". He received the DSP-2018 conference's commemorative plaque with the citation: "We wish to recognize and pay tribute to fred harris for his pioneering contributions to digital signal processing algorithmic design and implementation, and his visionary and distinguished service to the Signal Processing Community." The spelling of his name with all lower case letters is a source of distress for typists and spell checkers. A child at heart, he collects toy trains and old slide rules.

Resampling Filters: Interpolators and Interpolation

Status: Available Now

The first time I had to design an interpolator to change an existing time series from one sample rate to another was in the early 1960s. A group of engineers was determining the acoustic signature of a ship in San Diego Harbor. Two small vessels circled the ship and collected samples of the ship's sounds to be cross-correlated off-line in a mainframe computer. Imagine our surprise when we realized that the two collection platforms had operated at different sample rates to collect their versions of the sampled data signal: 10 kHz and 12 kHz! You can't correlate time sequences that have different sample rates! It was an interesting learning process.
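With today's tools the fix is a rational resampler. As a small illustration (a sketch only, and certainly not the hardware we had back then), here is how MATLAB's resample can move the 10 kHz record onto the 12 kHz grid; the 400 Hz test tone and record length are made-up numbers standing in for the ship's sound.

    fs1 = 10e3;  fs2 = 12e3;          % the two collection rates from the story above
    t1  = (0:9999)/fs1;               % one second of data at 10 kHz
    x1  = cos(2*pi*400*t1);           % a 400 Hz test tone standing in for the recorded sound
    x2  = resample(x1, 6, 5);         % interpolate by 6, decimate by 5: 10 kHz -> 12 kHz
    % x2 now lives on the 12 kHz grid and can be cross-correlated with the
    % record collected by the other platform.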

My Webster's Second Collegiate Dictionary lists, in its third entry, a math definition of interpolate as: "To estimate a missing functional value by taking a weighted average of known functional values at neighboring points." Not bad, and that certainly describes the processing performed by a multirate filter. Interpolation is an old skill that many of us learned before calculators arrived and keystrokes replaced tables of transcendental functions such as log(x) and the various trigonometric functions. Take, for example, the NBS Applied Mathematics Series AMS 55, the Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables by Abramowitz and Stegun. This publication contains numerous tables listing functional values of different functions, sin(θ) for example, for values of θ equal to … 40.0, 40.1, 40.2, … and so on. Interpolation is required to determine the value of sin(θ) for values of θ between the listed entries. Interpolation was such an important tool in numerical analysis that three pages in the introduction of the handbook are devoted to the interpolation process. Interpolation continues to be an important tool in signal processing, and we now present and discuss the DSP filtering description of the interpolation process.
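The dictionary's "weighted average of known functional values at neighboring points" is easy to make concrete. Here is a minimal MATLAB sketch of reading between two table entries; the 40.17 degree argument is simply a made-up example.

    % Estimate sin(40.17 deg) from the tabulated neighbors at 40.1 and 40.2 degrees.
    x   = 40.17;                             % argument falling between table entries
    w   = (x - 40.1)/0.1;                    % fractional distance to the upper entry
    y   = (1 - w)*sind(40.1) + w*sind(40.2); % weighted average of the two neighbors
    err = y - sind(x)                        % error of this two-point (linear) interpolator
    % Higher order interpolators use more neighbors and better weights, which is
    % exactly what the taps of a multirate filter supply.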

The ability to change the sample rate of a sequence to another selected sample rate has become the core enabler of software defined radios and of sampled data communication systems. Synchronizing remote clocks on moving platforms and adjusting clocks to remove offsets due to environmental effects, manufacturing tolerances, and Doppler-induced frequency shifts are but the tip of the many things we accomplish with arbitrary interpolators. Let's have a cheer, hear, hear, for interpolators!

Go to Session


Polyphase Analysis and Synthesis Filter Banks: Capabilities and Implementation

Status: Available Now

Two papers related to this workshop have been made available by fred harris.

Polyphase analysis and synthesis filter banks, a very important topic in the multirate signal processing community, are the most incredible signal processing algorithms. Your first reaction when you finally understand them is: "I'll be darned!" Your second reaction is: "I can hardly wait to tell all my friends about this!" Do you know about these things?

Let's start with the analysis filter bank, which has a dual structure called the synthesis filter bank. Each does the opposite of the other. The analysis channelizer processes a sampled data input signal spanning a wide frequency band containing many contiguous narrow bandwidth channels. The result of that processing is a set of narrow bandwidth signals translated from their original centers to baseband, low-pass filtered to their channel bandwidths to separate them from their neighbors, and further down sampled to a rate commensurate with their reduced bandwidths. This process, for a single channel, is called a digital down converter (DDC). The remarkable property of the analysis channelizer is that the cost of M (say 100) channels is only about the cost of 5 channels. Amazingly, the process occurs in a completely different manner and order from what you would imagine! Rather than down convert, filter, and reduce the sample rate, the sample rate is reduced on the way into the filter bank and the processing is performed at the reduced output rate instead of at the high input rate.
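To give a feel for the structure, here is a minimal critically sampled M-path sketch in MATLAB: the input commutator delivers M samples per output time, each of the M polyphase arms runs at the reduced rate, and an M-point transform separates the aliases into the M channel outputs. The prototype filter, channel count, and test tone are made-up choices; the workshop develops the real design questions (prototype specification, oversampled variants, and so on).

    M  = 8;                               % number of channels
    N  = 6*M;                             % prototype filter length
    h  = fir1(N-1, 1/M);                  % prototype low-pass with cutoff fs/(2*M)
    n  = 0:8191;
    x  = exp(1j*2*pi*(3/M)*n);            % test tone sitting in channel k = 3
    reg  = zeros(1, N);                   % tapped delay line, newest sample first
    nblk = floor(numel(x)/M);
    Y    = zeros(M, nblk);                % one row per channel, one column per output time
    for b = 1:nblk
        blk = x((b-1)*M + (1:M));                 % accept the next M input samples
        reg = [fliplr(blk), reg(1:N-M)];          % slide them in, newest at reg(1)
        v   = sum(reshape(reg .* h, M, []), 2);   % the M polyphase arm outputs
        Y(:, b) = M*ifft(v);                      % M-point transform separates the aliases
    end
    % Y(k+1, :) is channel k, translated to baseband at the reduced rate fs/M;
    % here abs(Y(4, :)) settles near 1 once the filter fills, and every other row stays small.

One pass of the loop costs N multiplies plus one M-point transform and delivers all M channel outputs at once, which is the source of the pricing in the story below.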

If we were a fly on the wall we might overhear this conversation between a potential buyer and the salesperson in the polyphase analysis filter bank store. The customer asks, "What will a single channel DDC cost me?" The salesperson answers, "It will cost you $10." The customer then asks, "What will 10 equal BW channels of DDCs cost me?" The salesperson answers, "It will cost you $100, but if you are interested, we have a special this week: a 100 channel DDC for only $50. For that price, you can compute all 100 channels, throw away 90 of them, and still have your 10 channels at a reduced price!" Which option do you think the customer will buy? Have we caught your attention?

There is surely another store in town that sells synthesis filter banks. These banks up-sample many narrowband baseband signals to a higher sample rate and translate them to selected center frequencies to form a composite broadband spectrum. These are digital up converters (DUCs). The two filter banks are duals of each other: one uses the aliasing caused by down sampling to translate all the band-center signals to baseband and a clever trick to separate the aliases, while the other uses the aliasing caused by up sampling to translate all the baseband signals to selected band centers and the same trick to perform the dual task of separating the up-sampled aliased spectral bands.
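For reference, here is what the synthesis bank accomplishes for each channel: the sketch below is the single-channel DUC it replaces (with a made-up baseband signal and channel number), while the polyphase bank merges all M of these into one M-point transform, M short polyphase arms, and an output commutator.

    M  = 8;
    y3 = exp(1j*2*pi*0.01*(0:1023));       % a narrowband baseband signal bound for channel k = 3
    up = zeros(1, M*numel(y3));
    up(1:M:end) = y3;                      % zero pack up to the high sample rate
    g  = M*fir1(6*M-1, 1/M);               % interpolating low-pass, cutoff fs/(2*M), passband gain M
    s3 = filter(g, 1, up) .* exp(1j*2*pi*(3/M)*(0:numel(up)-1));  % translate to the channel center
    % Summing the corresponding signals from all M channels forms the composite broadband spectrum.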

We will review the signal processing sequence of the M-path analysis and synthesis channelizers. We will then go through all the steps to implement MATLAB realizations of the same and illustrate their performance and methods of verifying their operation. This is a process you have to do three or four times till it finally clicks. I have former students contact me and ask, "Remind me why we did this thing at this point in the script?" Reset time!

Go to Session


Polyphase Wide-Bandwidth Filters Implemented with Order of Magnitude Workload Reduction: Capabilities and Implementation

Status: Available Now

We examined polyphase analysis and synthesis filter banks in an earlier workshop. The two filter banks are duals and can operate independently of each other. In this workshop, we use both banks in a tightly coupled manner to synthesize broadband filters with an order of magnitude workload reduction. For this design process, the filter bandwidths are a large fraction of the sample rate. Since the target filter specification has a wide bandwidth, it would seem that the signal processing we conduct here can't be the same as that used in the analysis filter banks. That is, we can't reduce the sample rate to the reduced bandwidth of the signal and operate the script at the reduced clock rate as we did in the channelizer designs. In fact, we can do that! We can form a filter with a wide bandwidth from a set of narrow bandwidth fragments of the wide bandwidth system by using the perfect reconstruction properties of the analysis channelizer's Nyquist segments. The synthesis channelizer seamlessly reassembles the desired wide BW filter from multiple contiguous narrow BW fragments formed by the analysis channelizer. The process trivially accommodates sample rate changes when there is a BW reduction in the assembled band, as well as frequency offsets and Hilbert transforms. The remarkable attribute of this process is the order of magnitude reduction in computational workload of the composite processing chain relative to the direct implementation of the same process.
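The key to the reassembly is the Nyquist (M-th band) property of the prototype filter: its M translated channel responses tile the band with a flat composite sum, so contiguous fragments add back to a clean wide passband. Here is a small MATLAB sketch of that property using a windowed sinc prototype with made-up lengths; it illustrates the tiling, not the design procedure used in the workshop.

    M = 8;  K = 8;
    n = -K*M:K*M;
    h = sinc(n/M)/M .* kaiser(numel(n), 8)';   % windowed sinc: h(m*M) = 0 for m ~= 0, h(0) = 1/M
    H = fft(h, 4096);
    Hsum = zeros(1, 4096);
    for k = 0:M-1
        Hsum = Hsum + circshift(H, [0, k*4096/M]); % sum of the M translated channel responses
    end
    % abs(Hsum) is flat across the whole band: the channel fragments tile the
    % spectrum, which is what lets the synthesis bank reassemble a wide filter
    % from contiguous narrowband pieces.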

We will build the MATLAB processing chain of the analysis and synthesis filter banks and then demonstrate variations of how they interact to simulate variable BW, variable sample rate, and variable frequency shift operations.

Go to Session


Green FIR Filters with Large Ratio of Sample Rate to Bandwidth

Status: Available Now

This presentation will show you how to design and implement narrowband filters with more than an order of magnitude reduction of workload. I was recently challenged to reduce the workload for a 301 tap low-pass FIR filter with a sample rate 50 times the bandwidth. After my first approach, which reduced the workload to 21 multiplies, I wondered: by how much could we reduce the workload? I finally stopped playing with the question when I reached 6 multiplies, which is a 50-to-1 workload reduction. The technique we present usually reduces the workload by a factor greater than 10. The only requirement to apply these techniques is that there be a large ratio of sample rate to bandwidth. Once we learn the simple trick to accomplish this reduction, we then pose the next question: can we achieve a similar reduction in workload when there is not a large ratio of sample rate to bandwidth? The answer, surprisingly, is yes! We will share the recipe for the secret sauce so you too will know how wideband filters can also be implemented with more than an order of magnitude workload reduction. How about a pair of 1400 complex tap filters replaced with 100 real multiplies?
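To calibrate expectations, here is the most familiar multirate route to this kind of saving. It is only a sketch with made-up absolute rates (the challenge above only fixes the 50-to-1 ratio), and it is not the specific recipe presented in the session: filtering at a reduced rate lets a much shorter filter cover the same transition band.

    fs = 50e3;  bw = 1e3;                 % sample rate 50 times the bandwidth
    x  = randn(1, 1e5);                   % placeholder input
    hd = fir1(300, 2*bw/fs);              % direct form: 301 taps running at the input rate
    yd = filter(hd, 1, x);
    y1 = resample(x, 1, 10);              % anti-alias and drop the rate by 10
    h2 = fir1(60, 2*bw/(fs/10));          % a far shorter filter does the job at the low rate
    y2 = filter(h2, 1, y1);
    ym = resample(y2, 10, 1);             % interpolate back up to the input rate
    % Every stage either runs at the reduced rate or uses a short filter, so the
    % multiplies per input sample fall well below the 301 of the direct design.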

When I first started showing folks how to build FIR filters with an order of magnitude workload reduction, no one seemed interested in clever solutions. I realized I had a marketing blind spot. I fixed that blind spot and now tell folks: let me show you a green solution to your problem! There is hardly any room left on the bandwagon.

Go to Session


The DSP Biquadratic Recursive Filter: A Fox in the Hen House

Status: Available Now

When we studied active analog filters we were taught that the biquadratic second order filter was the workhorse of active filter design. What made it so was the fact that we could form second order polynomials in both denominator and numerator with real coefficients. We also learned, when we performed sensitivity analyses relating root shifts to component value variations due to tolerance spreads, that lower order polynomials had reduced sensitivity. We learned that active filters should be implemented with multiple second order filters and possibly one first order filter. Control folks also learned this lesson. That was a good perspective for a designer to have.

When we started to implement high order recursive filters in DSP land we followed the standard understanding that the sampled data biquadratic filter, with decoupled second order denominator and numerator, offered us the same capabilities: complex roots with real coefficients and low sensitivity to root shifts due to coefficient quantization. We were so pleased with the carryover from active analog filters to sampled data filters that we failed to notice it was not true! We let the fox into the hen house without realizing what we had done.

The fox comes out to play when we try to form IIR low-pass filters with a large ratio of sample rate to bandwidth. What we learn is that it just doesn’t work! We need an alternate architecture or we should stop designing recursive filters with very small bandwidths relative to sample rate. One I see all the time is a 30 Hz wide low-pass or high-pass filter running at 48 kHz sample rate. Have you run into that? Did it take long for the hurt to go away when you found out your design didn’t work?  We will discuss how to fix the problem and make the fox go away.
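If you want to meet the fox before the session, here is a small MATLAB sketch of the symptom (not the fix): a 30 Hz low-pass at a 48 kHz sample rate parks the poles a whisker inside the unit circle, so even the biquad's feedback coefficients cannot survive quantization. The filter order and word length are made-up choices for illustration.

    fs = 48e3;  fc = 30;
    [z, p, k] = butter(4, 2*fc/fs);        % 4th order recursive low-pass, 30 Hz at 48 kHz
    max(abs(p))                            % pole radii of roughly 0.996 to 0.999
    sos  = zp2sos(z, p, k);                % the classic cascade of biquads
    sosq = sos;
    sosq(:, 5:6) = round(sos(:, 5:6)*2^15)/2^15;  % quantize the feedback coefficients to 16 bits
    imp = [1, zeros(1, 2^16 - 1)];
    H   = fft(sosfilt(sos,  imp));         % response of the designed cascade
    Hq  = fft(sosfilt(sosq, imp));         % response after coefficient quantization
    % plot(abs(H)); hold on; plot(abs(Hq)): the quantized biquads no longer deliver the design.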

Go to Session


Live Q&A with fred harris - The DSP Biquadratic Recursive Filter: A Fox in the Hen House

Status: Available Now

Live Q&A with fred harris following his talk titled "The DSP Biquadratic Recursive Filter: A Fox in the Hen House"

Go to Session


Live Q&A with fred harris - Green FIR Filters with Large Ratio of Sample Rate to Bandwidth

Status: Available Now

Live Q&A with fred harris following his talk titled "Green FIR Filters with Large Ratio of Sample Rate to Bandwidth"

Go to Session


Things We Should Not Do In Future Radios (Future Designs Should Not Include Past Mistakes) (2020)

Status: Available Now

Wireless technology is a shining example of a disruptive innovation that has changed society in remarkable ways. The innovation has altered how people communicate, how people access information, how people are entertained, and how people conduct and schedule their social lives. Every human activity advances and grows through a number of influences. One is experience, one is market forces, another is effective education, and yet another is common wisdom. Common wisdom is the set of entrenched perspectives and levels of understanding accepted by the community as guideposts of the process. In fact there are many examples to be found in the wireless community of common wisdom being faulty. Samuel Clemens said it well: "It ain't what you don't know that gets you in trouble, it's what you know for sure that just ain't so." The wireless community is not free of entrenched faulty common wisdom, which is passed on to successive practitioners of the art. Universities are just as liable as industry for not examining and questioning common wisdom. In this presentation we examine the evolution of wireless technology from the early days through now and show how a number of wisdoms can be shown to not be wise but nevertheless have become entrenched in the fabric of our wireless technology.

Go to Session


Multirate Polyphase Filters and Filter Banks, (GREEN Technology, also known as DSP Magic) (2020)

Status: Available Now

Recently, someone posted a question on a DSP blog I visit occasionally: how does one design a very narrow bandwidth low-pass filter? One version of the problem is a filter with a 10 Hz wide pass band, a 10 Hz wide transition band, and a 1 kHz sample rate, with stopband attenuation >80 dB and passband ripple <0.01 dB. This is a very bad combination: a small transition bandwidth paired with a high sample rate! I think students post their homework problems on the blog, so I seldom volunteer to do their homework. I did, however, read the many suggestions posted by regular subscribers to the blog. They were interesting to read, but nothing clever, and of limited value. Some were just plain silly, but to quote a famous line, "who am I to judge?" The consensus was that some problems are hard and require lots of resources, and this is one of them! All it takes is lots of filter coefficients and lots of multiplies and adds; 405 taps seemed to be about the right number. When I read one suggestion from someone I know at Westminster University in London, I simply had to throw my hat in the ring. It then became a game: how small could you make the filter and still satisfy the specifications? For a week I submitted daily solutions requiring fewer and fewer coefficients. I started at 38 M&A per input sample and stopped when I reached 6 M&A per input sample!
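For scale, harris's rule of thumb for the length of a direct design (the estimate given in his multirate book) is the ratio of sample rate to transition bandwidth times the attenuation in dB divided by 22; applied to the numbers above it lands in the same ballpark as the consensus answer.

    % Rough length estimate for the direct design: N ~ (fs/transition BW)*(attenuation dB/22).
    N_est = (1000/10)*(80/22)             % about 364 taps, close to the 405 quoted above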

The presentation will show how to build narrowband filters with more than an order of magnitude reduction of workload. The only requirement is that there be a large ratio of sample rate to bandwidth. Once we learn the simple trick to accomplish this reduction, we pose the question: can we achieve a similar reduction in workload when there is not a large ratio of sample rate to bandwidth? The answer, surprisingly, is yes! We will share the recipe for the secret sauce so you too will know how wideband filters can also be implemented with more than an order of magnitude workload reduction. How about an I-Q filter pair with 1400 taps per arm replaced with a resampling filter requiring only 100 real multiplies?

Go to Session


Live Q&A Discussion - Multirate Polyphase Filters and Filter Banks, (GREEN Technology, also known as DSP Magic) (2020)

Status: Available Now

Live Q&A session with fred harris following his talk titled 'Multirate Polyphase Filters and Filter Banks'

Go to Session


Live Q&A Discussion - Things We Should Not Do In Future Radios (2020)

Status: Available Now

Live Q&A session with fred harris following his talk titled 'Things We Should Not Do In Future Radios'

Go to Session