The way you originally presented the problem, the answer is no, for the reason weetabixharry mentions above. If you now say the signal is BOTH frequency AND amplitude limited, then the answer is yes. I can think of a situation where both Fx and Ts are so large that [X(t+Ts) - X(t)] can span the full amplitude limits of X(t); even then it is "predictable", but only to within those amplitude limits.
If, however, Fx < 1/(2·Ts), then the biggest change between samples must be less than the amplitude limits, and the next sample is predictable to within a tighter bound. Maybe you should present your problem a little more clearly. I have assumed the signal X(t) and its frequency representation are in analog 'real' form (after all, it's a real application).
This is quite different from saying it's a set of digital samples with Fx below the Nyquist frequency.
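As a quick sanity check of that Fx < 1/(2·Ts) claim (my own illustration, not part of the original question): for a single sinusoid of amplitude A and frequency Fx, the worst-case sample-to-sample change works out to 2·A·|sin(π·Fx·Ts)|, which only reaches the full peak-to-peak range 2·A once Fx·Ts reaches 1/2. A minimal Python sketch, with Fx and the time grid chosen arbitrarily for illustration:

```python
import numpy as np

# Illustration only: for X(t) = A*sin(2*pi*Fx*t), the largest change over a
# step Ts is 2*A*|sin(pi*Fx*Ts)|, which only equals the full peak-to-peak
# range (2*A) once Fx*Ts = 1/2, i.e. once Fx is no longer below 1/(2*Ts).

A = 1.0                                   # amplitude limit
Fx = 100.0                                # signal frequency in Hz (arbitrary)
dt = 1e-6                                 # fine time grid spacing
t = np.arange(0.0, 1.0, dt)               # one second of time samples
x = A * np.sin(2 * np.pi * Fx * t)

for ratio in (0.1, 0.25, 0.5):            # ratio = Fx*Ts
    Ts = ratio / Fx
    n = int(round(Ts / dt))               # step Ts expressed in grid points
    max_change = np.max(np.abs(x[n:] - x[:-n]))     # observed |X(t+Ts) - X(t)|
    bound = 2 * A * abs(np.sin(np.pi * Fx * Ts))    # analytic worst case
    print(f"Fx*Ts = {ratio:.2f}: max change = {max_change:.3f}, bound = {bound:.3f}")
```

Running it shows the sample-to-sample change staying well inside the ±A amplitude limits while Fx·Ts < 1/2, and only hitting the full 2·A swing at Fx·Ts = 1/2.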