You might be confusing two different concepts. To get 1 Hz resolution from a "normal" (direct-counting) frequency counter, you have to count for a full 1 second gate. However, some commercially manufactured counters work the other way around: they contain a high-frequency internal oscillator (500 MHz, for example), and the input waveform gates the counting of that internal signal. The number of internal cycles counted (including partial cycles) is then used to compute the input frequency, even though the input itself was never counted for a full second. This is usually called reciprocal counting, and it gives resolution far better than the gate time alone would suggest. There are probably DSP-related algorithms that can do something similar in software. Some of these methods, especially the hardware-oriented ones, have bias errors caused by the way the input gate squares up the signal, and those errors have to be calibrated out.
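To make the arithmetic concrete, here is a minimal Python sketch of the reciprocal-counting computation. The 500 MHz timebase and the specific tick counts are assumed purely for illustration, and real instruments also interpolate the partial timebase cycles, which this sketch ignores:

```python
# Minimal sketch of reciprocal counting, assuming a 500 MHz internal
# timebase and a gate held open for a whole number of input cycles.

F_REF = 500e6          # internal oscillator frequency, Hz (assumed)

def reciprocal_frequency(input_cycles, timebase_ticks):
    """Estimate the input frequency from the number of timebase ticks
    counted while `input_cycles` whole input periods elapse."""
    gate_time = timebase_ticks / F_REF   # duration of the gate, seconds
    return input_cycles / gate_time      # frequency = cycles / time

# Example (made-up numbers): 100 input cycles gate 4,999,987 ticks of
# the 500 MHz clock, i.e. a gate of only ~10 ms.
print(reciprocal_frequency(100, 4_999_987))   # ~10,000,026 Hz
```

A one-tick change in the timebase count shifts the result by about 2 Hz here, so a roughly 10 ms gate resolves ~2 Hz, whereas directly counting the input over the same 10 ms would only resolve 100 Hz.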
Now, as for the phase noise of the signal you are measuring...phase noise can be thought of as instantaneous frequency error, so if you use too small a measurement window and the source is noisy, the measured frequency can be far off from the long-term "mean" frequency. To solve that, one typically averages many measurements until the random deviations are sufficiently small.
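A quick simulation, assuming the per-reading error behaves like zero-mean random noise (which short-term phase noise roughly does), shows why averaging works: the spread of the mean shrinks as 1/sqrt(N). The frequency and noise level below are made up for illustration:

```python
# Toy simulation of averaging noisy frequency readings. The per-reading
# error is modeled as zero-mean Gaussian noise; all numbers are assumed.
import random
import statistics

TRUE_FREQ = 10e6        # long-term mean frequency, Hz (assumed)
READING_SIGMA = 5.0     # RMS error of a single short measurement, Hz (assumed)

def one_reading():
    """One short-gate measurement, perturbed by random 'phase noise'."""
    return random.gauss(TRUE_FREQ, READING_SIGMA)

readings = [one_reading() for _ in range(400)]
mean = statistics.fmean(readings)

# With 400 independent readings, the random error of the mean is about
# READING_SIGMA / sqrt(400) = 0.25 Hz, versus 5 Hz for a single reading.
print(f"single reading spread ~ {READING_SIGMA} Hz")
print(f"mean of 400 readings : {mean:.2f} Hz")
```

Note this only beats down the random part of the error; any bias (like the gate-squaring errors mentioned above) survives averaging and still has to be calibrated out.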