
What is the difference between the resolution and sensitivity in sensor science


Y.T_comp

As-Salāmu `Alaykum,
Hi all.

I have a new question, and I think this forum is the best place to ask it.

Anyway, I have a misunderstanding about two terms:
sensitivity and resolution.

I read about them on an NI website, and I didn't find a difference between them.

But I conclude (maybe I'm wrong) that:

sensitivity: the smallest amount an instrument can detect that affects the instrument but does not have to change the output.

resolution: the smallest amount an instrument can detect that does have to change the output.

Am I right? Thank you.
 

Sensitivity = the change of output (measured value) per unit of input (measurand). Sensitivity is the slope of the static calibration curve at a given value of the input.

[Image: accuracyanderror_sensitivity.png — sensitivity shown as the slope of the calibration curve]

Resolution = the smallest increment of change in the measured value that can be determined from the readout or recording instrument.
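
To make the slope definition concrete, here is a minimal Python sketch (the calibration data and the linear fit are assumptions for illustration, not from the posts above):

```python
import numpy as np

# Hypothetical static calibration of a pressure sensor:
# input (measurand) in bar, output (measured value) in mV.
pressure_bar = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
output_mV = np.array([0.1, 25.2, 50.0, 74.9, 100.1])

# Sensitivity = slope of the static calibration curve.
# For a linear sensor, a first-order fit gives one slope over the whole range;
# for a nonlinear sensor the slope (and so the sensitivity) varies with the input.
slope_mV_per_bar, intercept_mV = np.polyfit(pressure_bar, output_mV, 1)
print(f"Sensitivity = {slope_mV_per_bar:.1f} mV/bar")  # about 50 mV/bar here
```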

-------------

Sensitivity is the smallest unit of a given parameter that can be meaningfully detected with the instrument when used under reasonable conditions. For example, assume the sensitivity of a DMM in the volts function is 100 nV. With this sensitivity, the DMM can detect a 100 nV change in the input voltage.

For a noise-free DMM, resolution is the smallest change in an input signal that produces, on average, a change in the output signal. Resolution can be expressed in terms of bits, digits, or absolute units, which can be related to each other.
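
As a sketch of how bits, digits, and absolute units relate (the 16-bit ADC and the 10 V range are assumed values, not from the NI text):

```python
# Absolute resolution of an ADC: one code step (LSB) over the full-scale range.
full_scale_V = 10.0
bits = 16
lsb_V = full_scale_V / 2**bits
print(f"16-bit ADC on a 10 V range: 1 LSB = {lsb_V * 1e6:.1f} uV")  # ~152.6 uV

# A 5 1/2 digit DMM shows up to 199999 counts; assuming the full 10 V
# maps onto those counts, one display count is about 50 uV.
counts = 199999
print(f"5 1/2 digits: 1 count = {full_scale_V / counts * 1e6:.1f} uV")
```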

I suggest reading these for more info:
Digital Multimeter Measurement Fundamentals
http://zone.ni.com/devzone/cda/tut/p/id/3295


**broken link removed**
Section 1 / page 10

[Image: LowLevMsHandbk_1_Accu_Term.png — accuracy terminology]

Example (pressure sensor):

Resolution defines the ability to distinguish one reading from another. For a digital pressure gauge the resolution is normally given as the number of readable digits; e.g. a 2 bar range with a 5-digit display would have a 0.1 mbar resolution.
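
A quick check of that digit arithmetic, as a sketch:

```python
# A 2 bar range on a 5-digit display reads up to 2.0000 bar,
# i.e. 20000 counts, so one count is 0.0001 bar = 0.1 mbar.
range_bar = 2.0
display_counts = 20000
print(f"{range_bar / display_counts * 1000:.1f} mbar per count")  # 0.1 mbar
```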

A strain gauge pressure sensor without any amplification is described as having a signal output with infinite resolution, since there is no signal conditioning to limit it.

Accuracy refers to the worst-case error in measuring a particular reading compared to the actual value. If the resolution is of a similar magnitude to the accuracy, it should be included in the accuracy statement, since the true uncertainty of a reading should also encompass readability.

The following are examples of how resolution and accuracy are described in specifications of pressure measuring equipment:

Digital Pressure Gauge
Pressure Range: 200 bar
Accuracy: 0.05% Full Scale = 100 mbar
Display Resolution: 5 digits = 10 mbar

Amplified Pressure Transducer (0–10 Vdc output)
Pressure Range: 200 bar
Accuracy: 0.25% Full Scale = 500 mbar / 25 mV
Digital-to-Analog Amplifier Resolution: 0.002% Full Span = 4 mbar / 0.2 mV
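
A short sketch that recomputes those specification numbers (all values taken from the lists above):

```python
def pct_of(full_scale, percent):
    """Return percent-of-full-scale as an absolute value."""
    return full_scale * percent / 100.0

# Digital pressure gauge, 200 bar range
print(pct_of(200, 0.05) * 1000)  # accuracy: 100.0 mbar
print(200 / 20000 * 1000)        # 5 digits (200.00 -> 20000 counts): 10.0 mbar

# Amplified pressure transducer, 200 bar range, 0-10 V output
print(pct_of(200, 0.25) * 1000, pct_of(10, 0.25) * 1000)    # 500.0 mbar, 25.0 mV
print(pct_of(200, 0.002) * 1000, pct_of(10, 0.002) * 1000)  # 4.0 mbar, 0.2 mV
```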
 