fundamental question regarding bandwidth

Status
Not open for further replies.

yykcw

Newbie level 3
Joined
Dec 1, 2014
What I've learnt so far is that if we transform a time-domain signal into the frequency domain, its bandwidth shows the range of frequencies required to generate that signal. In optical fiber, however, a light source with a smaller spectral width supports a larger bandwidth. I just don't understand the physical meaning of bandwidth there, or how it relates to the data rate.
 


The link between the time and frequency domain is the Fourier Transform.
In optical fiber, as elsewhere, the medium's bandwidth is the available bandwidth. Transmitting data requires modulating a carrier frequency. Different modulation types occupy different bandwidths for the same data rate, and the resulting spectra are then arranged to fill the available spectrum of the propagation medium.
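To make the link between data rate and occupied bandwidth concrete, here is a minimal sketch (all parameters assumed, not from the thread) that builds a rectangular-pulse BPSK signal at two bit rates and measures the band around the carrier that holds most of the power. Doubling the bit rate roughly doubles the occupied bandwidth, since BPSK's spectrum is a sinc-squared whose width scales with the symbol rate.

```python
import numpy as np

fs = 1_000_000   # sample rate, Hz (assumed)
fc = 100_000     # carrier frequency, Hz (assumed)

def bpsk_bandwidth(bit_rate, n_bits=2000, frac=0.99):
    """Bandwidth (Hz) around the carrier containing `frac` of the power."""
    rng = np.random.default_rng(0)
    sps = int(fs / bit_rate)                       # samples per bit
    bits = rng.integers(0, 2, n_bits) * 2 - 1      # random +/-1 symbols
    baseband = np.repeat(bits, sps).astype(float)  # rectangular pulses
    t = np.arange(baseband.size) / fs
    signal = baseband * np.cos(2 * np.pi * fc * t)  # BPSK: phase 0 or pi
    psd = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    # collect FFT bins nearest the carrier until `frac` of the power is included
    order = np.argsort(np.abs(freqs - fc))
    cum = np.cumsum(psd[order]) / psd.sum()
    k = np.searchsorted(cum, frac) + 1
    band = freqs[order[:k]]
    return band.max() - band.min()

bw_10k = bpsk_bandwidth(10_000)
bw_20k = bpsk_bandwidth(20_000)
print(f"occupied BW at 10 kb/s: {bw_10k / 1e3:.1f} kHz")
print(f"occupied BW at 20 kb/s: {bw_20k / 1e3:.1f} kHz")
```

The same idea carries over to optics: the modulated light occupies a band around the optical carrier whose width grows with the symbol rate, which is why the medium's available bandwidth limits the achievable data rate.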
 

In reality, even with no modulation (no information, so the bandwidth should be zero), the actual emission spectrum has some nonzero width, simply because carrier generation is not ideal. In optics this width can be larger than the bandwidth of the useful modulation. For example, a TV remote uses only tens of kHz of modulation bandwidth, but the spectrum of the LED's emission is far wider.
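The point above can be illustrated numerically. This sketch (crude model, all numbers assumed) compares an ideal sinusoidal carrier with one whose phase performs a random walk, a common first approximation of oscillator or laser phase noise. The ideal tone occupies essentially one FFT bin, while the phase-noisy tone spreads over many bins even though it carries no information.

```python
import numpy as np

fs = 65_536                     # sample rate, Hz (assumed; chosen so fc lands on a bin)
n = 65_536                      # number of samples -> 1 Hz bin spacing
fc = 10_000                     # carrier frequency, Hz (assumed)
t = np.arange(n) / fs

ideal = np.cos(2 * np.pi * fc * t)              # perfect oscillator
rng = np.random.default_rng(1)
phase_walk = np.cumsum(rng.normal(0, 0.05, n))  # random-walk phase noise (assumed std)
noisy = np.cos(2 * np.pi * fc * t + phase_walk)

def width_3db(x):
    """Number of FFT bins within 3 dB of the spectral peak."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    return int(np.count_nonzero(psd >= psd.max() / 2))

w_ideal = width_3db(ideal)
w_noisy = width_3db(noisy)
print("ideal tone 3 dB width (bins):", w_ideal)
print("noisy tone 3 dB width (bins):", w_noisy)
```

For a laser, this intrinsic linewidth plays the same role: the unmodulated source already has a nonzero spectral width, which is why a narrow-linewidth source leaves more of the medium's bandwidth available for the modulation itself.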
 

But what I don't understand is this: say I use 1550 nm light as the source and modulate it with BPSK. After passing through an optical fiber, dispersion reduces the bandwidth, so what does this reduction actually mean? If only a single frequency of light forms the signal, what is the point of finding the frequency range needed to generate it?
 


