
Why condition of linear phase is important in filters?

Status
Not open for further replies.

wajahat

Advanced Member level 4
Full Member level 1
Joined
Feb 3, 2006
Messages
102
Helped
3
Reputation
6
Reaction score
1
Trophy points
1,298
Activity points
2,097
If the phase of a filter is linear, then no distortion occurs in the output signal. How?
 

Band-pass filter: constant gain, phase linear with ω.


Nonlinear phase implies (linear) signal distortion; the group delay isn't constant with ω.
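To illustrate the point above, here is a small numpy sketch (my own example, not from any filter library): the same two-tone signal is passed through a purely linear-phase response, which just delays it, and through a quadratic-phase response, which delays each frequency differently and so distorts the waveform.

```python
import numpy as np

fs = 1000                                   # sample rate, Hz
t = np.arange(0, 1, 1 / fs)
# Two-tone test signal
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x)
f = np.fft.rfftfreq(len(x), 1 / fs)

# Linear phase: phi(f) = -2*pi*f*tau is a pure delay of tau seconds
tau = 0.01                                  # 10 ms, i.e. 10 samples
y_lin = np.fft.irfft(X * np.exp(-2j * np.pi * f * tau), n=len(x))

# Nonlinear (quadratic) phase: each frequency gets a different delay
y_non = np.fft.irfft(X * np.exp(-2j * np.pi * f**2 * 1e-4), n=len(x))

shift = int(tau * fs)
err_lin = np.max(np.abs(y_lin[shift:] - x[:-shift]))  # ~0: just a delayed copy
err_non = np.max(np.abs(y_non[shift:] - x[:-shift]))  # large: waveform changed
print(err_lin, err_non)
```

The linear-phase case reproduces the input exactly, only shifted in time; the quadratic-phase case has the same magnitude spectrum but a visibly different waveform.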
 


wajahat,
In general, phase linearity is a parameter defined over the passband of a filter. If the phase shift of a filter is linear with respect to frequency, then every frequency component of the signal is delayed by the same amount. The result is an output signal that is delayed, but not distorted. Group delay is defined as the negative slope of the phase with respect to frequency, so a constant group delay implies phase linearity.
Regards,
Kral
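A quick numerical check of the group-delay definition above (a numpy-only sketch; the 9-tap moving average and the frequency grid are my own choices): a symmetric FIR impulse response has linear phase, so −dφ/dω comes out constant at (N−1)/2 samples. The grid stays below the filter's first null at ω = 2π/9, where the phase would jump by π.

```python
import numpy as np

# Symmetric impulse response -> linear phase (here: 9-tap moving average)
h = np.ones(9) / 9.0
n = np.arange(len(h))

# Evaluate H(e^{jw}) on a grid inside the passband, below the first
# null at w = 2*pi/9 (the phase jumps by pi at nulls, so we stay clear)
w = np.linspace(0.05, 0.6, 256)
H = np.array([np.sum(h * np.exp(-1j * wk * n)) for wk in w])

phase = np.unwrap(np.angle(H))
group_delay = -np.diff(phase) / np.diff(w)   # -d(phase)/dw, in samples

print(group_delay.min(), group_delay.max())  # both ~4.0 = (N-1)/2
```

Every frequency on the grid sees the same delay of 4 samples, which is exactly the "delayed but not distorted" behaviour described above.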
 
