CyberByte
Newbie level 1
I'm kind of new to signal processing and I was wondering what the frequency of a signal that alternates between two frequencies would be. The signal would contain full periods of two different frequencies, let's say 1 Hz and 2 Hz to keep things simple. The signal can be represented like this:
Code:
time:   0 1 2 3 4 5 6 7 8 9 10 11
signal: + - + + - - + - + + -  -
where the signal repeats and each time unit lasts 0.25 seconds, so the signal runs at 2 Hz over time units 0–1 and at 1 Hz over time units 2–5 (also see image).
Now, do I just have two signals of 1 Hz and 2 Hz respectively, or did I somehow create a signal of 1.333 Hz?* According to Wikipedia, frequency is determined by dividing the number of periods by the time passed. If I do this after 0.5 seconds (2 time units), the frequency is 2 Hz; after 1.5 seconds it would be 1.333 Hz; and if I ignore the first 2 time units (or if I had started the signal with the 1 Hz sequence), it should be 1 Hz. So I guess it may not be very clear-cut and depends on the application...
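To sanity-check that arithmetic, here is a quick Python sketch (the pattern and the 0.25-second time unit are taken straight from the table above):

Code:
# 1 time unit = 0.25 s; each list entry is the length of one full period
# in time units: the 2 Hz period (+ -) spans 2 units, the 1 Hz period
# (+ + - -) spans 4 units.
UNIT = 0.25  # seconds per time unit

def average_frequency(period_lengths):
    """Periods completed divided by time elapsed (the Wikipedia definition)."""
    elapsed = sum(period_lengths) * UNIT
    return len(period_lengths) / elapsed

print(average_frequency([2]))     # after 0.5 s:  2.0 Hz
print(average_frequency([2, 4]))  # after 1.5 s:  1.333... Hz
print(average_frequency([4]))     # 1 Hz part alone: 1.0 Hz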
I basically want to make part of my computer screen flicker at a certain frequency, but my hands are tied by the monitor's refresh rate, so I was wondering whether I could get the desired frequency by alternating between frequencies that the monitor can natively generate (a rough sketch of this idea follows below). I need to know whether people would perceive the above case as a flicker of 1.333 Hz or as something else (but this is not important enough to warrant a full-blown experiment).
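For what it's worth, the same periods-over-time bookkeeping also predicts which average rates you can build from monitor-native frequencies. A rough sketch follows; the 60 Hz refresh rate is just an assumed example, not a real measurement:

Code:
REFRESH_HZ = 60  # assumed refresh rate, for illustration only

def native_flicker(frames_per_half_period):
    # Toggling the region every k frames flickers at R / (2k) Hz.
    return REFRESH_HZ / (2 * frames_per_half_period)

def mixed_frequency(parts):
    # parts: list of (frequency_hz, full_periods); returns the average
    # frequency of a pattern alternating full periods of each rate.
    periods = sum(n for _, n in parts)
    elapsed = sum(n / f for f, n in parts)
    return periods / elapsed

f2 = native_flicker(15)  # 2 Hz (toggle every 15 frames at 60 Hz)
f1 = native_flicker(30)  # 1 Hz (toggle every 30 frames at 60 Hz)
print(mixed_frequency([(f2, 1), (f1, 1)]))  # 1.333... Hz, the pattern above
print(mixed_frequency([(f2, 2), (f1, 1)]))  # 1.5 Hz, the pattern in the footnote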
Thanks for your time!
* It's 1.333 Hz because the 1 Hz period lasts twice as long as the 2 Hz period: the repeating block packs 2 full periods into 1.5 seconds, and 2 / 1.5 ≈ 1.333. In order to get 1.5 Hz you would need a signal like this: + - + - + + - - (3 periods in 2 seconds).