By the mathematical definition of 'Specific Detectivity',
D* = √(A·Δf) / NEP
where NEP is the noise equivalent power, A is the detector's active area, and Δf is the bandwidth.
It is said that D* is normalized with respect to A and Δf; D* should therefore be independent of A and Δf and be an intrinsic property of the detector material.
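For concreteness, here is a minimal Python sketch of the formula above; it is my own illustration, and the function name and the numerical values are arbitrary examples, not taken from any detector datasheet.

import math

def specific_detectivity(area_cm2, bandwidth_hz, nep_w):
    # D* = sqrt(A * delta_f) / NEP, in cm*Hz^(1/2)/W (Jones)
    return math.sqrt(area_cm2 * bandwidth_hz) / nep_w

# Hypothetical values: a 1 mm x 1 mm detector (A = 0.01 cm^2),
# 1 Hz bandwidth, NEP = 1e-12 W  ->  D* = 1e11 Jones
d_star = specific_detectivity(area_cm2=0.01, bandwidth_hz=1.0, nep_w=1e-12)
print(f"D* = {d_star:.2e} Jones")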
However, in the specifications of some photovoltaic IR detectors made of (Hg,Cd,Zn)Te semiconductor, the spectral response curves show that the specific detectivity decreases as the operational wavelength of the detector increases; please see the attached image, in particular curves 3 and 8.
Can anyone help me explain why this is so? Thanks.