Low Pass Filter Design


JiL0

Member level 1
Member level 1
Joined
Jan 29, 2004
Messages
36
Helped
0
Reputation
0
Reaction score
0
Trophy points
1,286
Activity points
444
phase delay low pass filter

Say I want to design an LPF for an input signal ranging from DC to 1 kHz. In that case I would design an LPF with a corner frequency at 1 kHz, right? But by doing so, would the phase drop near the corner frequency be a problem, since at the corner frequency the phase has already dropped by at least 45 degrees?
 

I would make the 3 dB point higher than 1 kHz.
 

The generally used method is to make the pass band wide enough to get the phase shift you want (read that as group delay) and then increase the number of poles to get the attenuation you need in the stop band. Don't be limited by the classical tables: software programs can design Chebyshev and elliptic filters with 0.001 dB ripple in the pass band. This will be degraded by the Q of the components to about 0.01 dB.
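
As an illustration, here is a minimal sketch (assuming Python with scipy.signal; the pass-band/stop-band edges and the 60 dB stop-band spec are made-up numbers, only the 0.001 dB ripple comes from the post) of letting software pick the Chebyshev order for a spec like this:

```python
import numpy as np
from scipy import signal

# Made-up spec: pass band to 1.2 kHz with 0.001 dB ripple,
# at least 60 dB attenuation beyond 3 kHz (analog prototype).
wp, ws = 2 * np.pi * 1200.0, 2 * np.pi * 3000.0   # rad/s
rp, rs = 0.001, 60.0                              # dB

# Minimum Chebyshev type-I order that meets the spec
n, wn = signal.cheb1ord(wp, ws, rp, rs, analog=True)
b, a = signal.cheby1(n, rp, wn, btype='low', analog=True)

# Check the attenuation actually reached at the stop-band edge
w, h = signal.freqs(b, a, worN=[ws])
print("order:", n)
print("attenuation at 3 kHz: %.1f dB" % (-20 * np.log10(abs(h[0]))))
```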
 

Is it right to say that phase shift will distort the signal? Is there a limit on the amount of phase shift so that the signal is still recognisable?
 

The phase shift causes delay. That's why we are concerned about the group delay of a filter.
 

JiL0 said:
Is it right to say that phase shift will distort the signal? Is there a limit on the amount of phase shift so that the signal is still recognisable?

Phase shift is delay; all filters delay the signal.

Phase shift does not by itself mean the signal is distorted: if the phase is linear with respect to frequency, the signal is distortionless and not distorted at all.

If the phase is not linear, then the signal is distorted. Note also that, even in that case, distortion is not to be understood as a change in amplitude.
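
A small numerical sketch of that point (assuming Python with numpy; the two-tone test signal and the delays are made up for illustration):

```python
import numpy as np

fs = 8000.0
t = np.arange(0, 0.01, 1 / fs)
# Made-up test signal: two tones added together
x = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 600 * t)

tau = 1e-3  # 1 ms
# Linear phase: both components delayed by the same time -> shape preserved
y_lin = (np.sin(2 * np.pi * 200 * (t - tau))
         + 0.5 * np.sin(2 * np.pi * 600 * (t - tau)))
# Non-linear phase: components delayed by different times -> shape changed
y_nonlin = (np.sin(2 * np.pi * 200 * (t - tau))
            + 0.5 * np.sin(2 * np.pi * 600 * (t - 2.5 * tau)))

n = int(round(tau * fs))               # delay in samples
print(np.allclose(y_lin[n:], x[:-n]))  # True: just a shifted copy of the input
print(np.allclose(y_nonlin[n:], x[:-n], atol=1e-2))  # False: waveform distorted
```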
 

I thought a signal is made up of many sine waves of different frequencies and amplitudes added together. So when some of the sine waves are delayed (like the higher-frequency ones), doesn't that mean that when they are added up it won't be the original signal?

Sorry, I didn't go into the mathematics in detail (not good at it); I'm just trying to picture it in layman's terms.
 

The group delay characteristic of a transfer function gives the delay of each frequency in the pass band. The shape of a square-wave signal is distorted when the components of the signal do not all have the same delay at the output.
 

One way to visualize that different phase shifts at different frequencies do not distort is to think of a piece of ideal transmission line of length L. All signals will appear at the output of the line after a delay time D, and the waveform will not be distorted.

The delay time D is L/V, where V is the velocity. From this delay you can calculate the phase shift at any frequency and see that, in radians or degrees, it is different for each frequency. You can then calculate that the differential phase shift, delta phase / delta frequency, is the same value for any two frequencies you pick. If the phase is calculated in radians and the frequency in radians per second, the result of this calculation will be in seconds of delay and should match the original delay calculation.
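
A quick numerical check of that argument (hypothetical line length and velocity, sketched in Python):

```python
import numpy as np

# Hypothetical ideal line: L = 20 m, V = 2e8 m/s -> delay D = 100 ns
L, V = 20.0, 2.0e8
D = L / V

f1, f2 = 1.0e6, 3.0e6            # any two frequencies, in Hz
phi1 = 2 * np.pi * f1 * D        # phase shift in radians, different at each frequency
phi2 = 2 * np.pi * f2 * D

# Differential phase over differential angular frequency gives back the delay
delay = (phi2 - phi1) / (2 * np.pi * f2 - 2 * np.pi * f1)
print(phi1, phi2)                # unequal phase shifts
print(delay, D)                  # both 1e-07 seconds
```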
 

An ideal transfer function has a phase characteristic that is linear with frequency. The corresponding group delay is constant, which means all frequency components of the signal have the same delay at the output. The composition of a complex signal (Fourier) is not affected by the phase characteristic: the complete signal is only delayed and keeps its original shape. Real transfer functions (filters) do not have a linear phase characteristic, especially at the edge of the pass band. The individual frequency components have different delays at the output, so the shape is distorted.
To correct such phase-characteristic errors of filters, phase equalizers are used. An absolutely error-free result cannot be achieved.
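
To see this numerically, here is a sketch (assuming Python with scipy.signal; the 4th-order 1 kHz Butterworth and 48 kHz sample rate are just example values) of how the group delay of a real low-pass filter rises near the band edge:

```python
import numpy as np
from scipy import signal

fs = 48000.0
# Example filter: 4th-order Butterworth low pass, 1 kHz cutoff
b, a = signal.butter(4, 1000.0, btype='low', fs=fs)

# Group delay (in samples) at a few pass-band frequencies
w, gd = signal.group_delay((b, a), w=np.array([100.0, 400.0, 700.0, 950.0]), fs=fs)
for f, d in zip(w, gd):
    print("%5.0f Hz : %.1f samples" % (f, d))
# The delay grows toward the 1 kHz corner, i.e. the phase is not linear there.
```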
 

Hi all.
How about digital filters?
The required cutoff frequency is quite low (1 kHz).
I think a DSP could easily implement a digital filter for this frequency range.
An FIR filter has, if I'm not wrong, a linear phase response and could do this job.
The problem is that, for good attenuation, the FIR would require much more processing (higher order) than an IIR (which has non-linear phase, similar to analog filters).
Maybe a PIC running at 20 MHz could also do this job for not-too-high orders.
S.
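
A minimal sketch of that idea (assuming Python with scipy.signal; the 101-tap length and 16 kHz sample rate are arbitrary choices):

```python
import numpy as np
from scipy import signal

fs = 16000.0
numtaps = 101                               # symmetric (odd-length) taps -> linear phase
h = signal.firwin(numtaps, 1000.0, fs=fs)   # 1 kHz low-pass FIR

# Group delay is constant at (numtaps - 1) / 2 samples, at every frequency
w, gd = signal.group_delay((h, [1.0]), w=np.array([100.0, 400.0, 800.0]), fs=fs)
print(gd)                 # ~[50. 50. 50.]
print((numtaps - 1) / 2)  # 50.0
```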
 

JiL0 said:
Sorry, I didn't go into the mathematics in detail (not good at it); I'm just trying to picture it in layman's terms.

OK, here is how to visualize it. Suppose you have an 800 Hz sinusoid. Assuming an ideal filter, the output would be the same sinusoid with, say, a 30-degree phase lag. So it lags in phase: it is a delayed version of what was fed into the filter, but with a 30-degree lag.

Now take another sinusoid, at 600 Hz. The output of the filter would be the same sinusoid, but with a 25-degree lag. If you can make a simple linear relationship between these phase lags with respect to frequency, you can predict exactly the phase lag of another sinusoid, e.g. a 400 Hz signal.

So the filter you designed does not distort the signal at all. The filter blocks unwanted signals, but at the same time what comes out of the filter is the same signal that entered it, delayed, and with a predictable phase lag.

Cheers,
zanove.
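
As a quick worked check of those numbers (taking the quoted lags as points on a straight line, purely for illustration):

```python
# Example lags from the post above: 25 degrees at 600 Hz, 30 degrees at 800 Hz
slope = (30.0 - 25.0) / (800.0 - 600.0)   # 0.025 deg per Hz
intercept = 30.0 - slope * 800.0          # 10 deg
print(slope * 400.0 + intercept)          # predicted lag at 400 Hz: 20 degrees
# The 0.025 deg/Hz slope corresponds to a group delay of 0.025 / 360 s ~ 69 us.
```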
 

Oh, I should also mention that the amplitude of the signal coming out of the filter is a different story.
 
