Hamid Ullah Jan
Newbie level 3
Hi everyone!
I want to read a voltage signal every 100 ms and store it in a file using a DAQ card.
The problem is that when I configure the DAQ Assistant in LabVIEW, I select "N Samples = 10" and a sampling rate of 1000 Hz. Logically this should work: the DAQ card should take a reading every 100 ms, i.e. 10 readings per second. But what actually happens is that I get 10 readings in the first 10 ms, and the remaining 990 ms of readings are missed. So I changed the sampling rate from 1000 to 500 and then to 100, but nothing changed.
Can anyone please explain the concept of "N samples vs. sampling rate vs. actual time" in LabVIEW, as this is not explained very clearly anywhere?
And what is the effect of changing the sampling rate?
Thanks
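
To show where my confusion comes from, here is my current understanding of the timing arithmetic as plain Python (no DAQ hardware involved; the function name is just mine for illustration). If finite "N Samples" acquisition captures all N samples back-to-back at the sample clock rate, then one acquisition spans N / rate seconds, not a full second:

```python
# My understanding (please correct me if this is wrong):
# in finite "N Samples" mode, the card takes all N samples
# back-to-back at the sample clock rate, so one acquisition
# spans N / rate seconds -- it does not spread them over 1 s.

def acquisition_span_ms(n_samples: int, rate_hz: float) -> float:
    """Time covered by one finite acquisition, in milliseconds."""
    return n_samples / rate_hz * 1000.0

# What I configured: 10 samples at 1000 Hz -> only 10 ms of data,
# which matches what I observed.
print(acquisition_span_ms(10, 1000))  # 10.0

# What I would apparently need for one sample every 100 ms:
# 10 samples at a 10 Hz rate -> spans the full 1000 ms.
print(acquisition_span_ms(10, 10))    # 1000.0
```

Is this the right way to think about it, and is dropping the rate to 10 Hz (or switching to continuous sampling) the correct fix?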