Bandwidth of a channel

Status
Not open for further replies.

iVenki

What is the bandwidth of a channel? What is the relationship between the bit rate and the bandwidth of a channel?
 



What is the meaning of bandwidth of a channel?
I studied that it is the width of the frequency band used to transmit the data, i.e. the difference between the upper and lower frequencies. My question is: what will happen if you transmit the data at a frequency higher than the upper frequency?
Bit rate is the number of bits transmitted per second.
But I can't understand how bit rate and bandwidth are related. I don't need a formula, just an explanation. How does bandwidth affect bit rate?
 

Bandwidth can be defined in several ways, but it usually means the band that contains a significant fraction (99% or more) of a signal's power. The criterion is typically stated in one of two ways: the bandwidth within which the signal suffers no more than a given amount of distortion, or the bandwidth outside of which the leaked power must stay below a given interference limit.
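To make the "99% of the power" definition concrete, here is a rough numerical sketch of how one might estimate the occupied bandwidth of a sampled signal. The function name, its parameters, and the test tone are my own illustration, not anything from the posts above:

```python
import numpy as np

def occupied_bandwidth(x, fs, fraction=0.99):
    # One-sided power spectrum via the FFT of a real signal.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # Cumulative power fraction from DC upward.
    cum = np.cumsum(spectrum) / spectrum.sum()
    # Trim (1 - fraction)/2 of the power from each spectral edge;
    # what remains is the band holding the requested fraction.
    tail = (1.0 - fraction) / 2.0
    low = freqs[np.searchsorted(cum, tail)]
    high = freqs[np.searchsorted(cum, 1.0 - tail)]
    return high - low

# A pure 125 Hz tone sampled at 1 kHz (125 Hz falls exactly on an
# FFT bin here, so there is no spectral leakage): essentially all
# of its power sits in one bin, giving a near-zero occupied bandwidth.
fs = 1000
t = np.arange(4096) / fs
bw = occupied_bandwidth(np.sin(2 * np.pi * 125 * t), fs)
print(bw)
```

A modulated or noisy signal fed to the same function would return a wider band, which is the quantity the distortion/leakage criteria above are applied to.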

There is no direct relationship between bit rate and bandwidth without specifying the number of signal levels per symbol period, which is described by Hartley's theorem. Without a signal-to-noise restriction you could use a very large number of levels M per symbol, giving bit rate ≤ 2 × Bandwidth × log2(M); this is Hartley's extension of Nyquist's original binary result. Shannon then tied in the statistical nature of the noise: bit rate ≤ Bandwidth × log2(1 + S/N).
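To see the two limits side by side, here is a small Python sketch. The figures (3 kHz bandwidth, 4 levels, 30 dB SNR) are the classic telephone-line textbook example, not numbers from this thread:

```python
import math

def nyquist_hartley_rate(bandwidth_hz, levels):
    # Nyquist/Hartley limit for a noiseless channel
    # with M discrete signal levels per symbol.
    return 2 * bandwidth_hz * math.log2(levels)

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity of a noisy channel;
    # SNR is a linear power ratio, not dB.
    return bandwidth_hz * math.log2(1 + snr_linear)

print(nyquist_hartley_rate(3000, 4))   # 4 levels -> 12000.0 bits/s
print(shannon_capacity(3000, 1000))    # 30 dB SNR -> about 29902 bits/s
```

Note how adding levels raises the Hartley rate without bound, while the Shannon limit caps what any choice of M can actually achieve once noise is accounted for.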
 