Did you take linear algebra? If you did, remember that if Hx = Dx, where H is a matrix, x is a vector, and D is a scalar (usually written as lambda), then D is an eigenvalue of H, and x is an eigenvector of H.
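Here's a quick numerical sanity check of that definition (the matrix is just an arbitrary example) using NumPy's eigendecomposition:

```python
# A minimal sketch: verify Hx = Dx numerically for a small
# (arbitrarily chosen) matrix using NumPy's eigendecomposition.
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # small symmetric example matrix

D, X = np.linalg.eig(H)      # eigenvalues in D, eigenvectors as columns of X

for i in range(len(D)):
    x = X[:, i]
    # H @ x should equal D[i] * x, up to floating-point error
    assert np.allclose(H @ x, D[i] * x)
    print(f"eigenvalue {D[i]:.3f}: Hx == Dx holds")
```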
Now think of the discrete case of a filter h and a signal x. If we feed x through h to get y, we have the discrete-time convolution y[n] = sum_k( h[k] x[n-k] ). If we let x[n] = Ae^{jwn}, a discrete-time complex sinusoid, we see that y[n] = sum_k( h[k] Ae^{jw(n-k)} ) = Ae^{jwn} * sum_k( h[k] e^{-jwk} ). Since the time index n does not appear in the summation, the sum is a constant with respect to n. In fact, it is the DTFT of h evaluated at w! Let's call it D(w). So we have y[n] = Ae^{jwn} * D(w) = D(w) x[n], or y = Dx. Look familiar?

It's a little more complicated than the matrix case, since in general either h or x may be infinitely long, but if you consider filtering with h to be the LTI operation represented by H, then y = Hx and y = D(w)x together give Hx = D(w)x. This is where the term 'eigenfunctions' comes from, and it relates to why eigen-analysis is often called spectral analysis! x[n] = e^{jwn} is like an infinite eigenvector of the system H, with the eigen-spectrum (eigenvalues) being D(w), the discrete-time Fourier transform of h!
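Here's a minimal numerical sketch of that eigenfunction property (the filter taps and the frequency w are arbitrary choices, not from the derivation above): feeding e^{jwn} through an FIR filter h just scales it by D(w), the DTFT of h at w, once the filter's startup transient has passed.

```python
# A minimal sketch: feed a complex sinusoid e^{jwn} through an
# (arbitrarily chosen) FIR filter and check that the steady-state
# output is the input scaled by D(w), the DTFT of h at frequency w.
import numpy as np

h = np.array([0.5, 0.3, 0.2])    # arbitrary FIR filter taps
w = 0.7                          # arbitrary test frequency, rad/sample
n = np.arange(200)
x = np.exp(1j * w * n)           # x[n] = e^{jwn}

# D(w) = sum_k h[k] e^{-jwk}, the DTFT of h evaluated at w
D = np.sum(h * np.exp(-1j * w * np.arange(len(h))))

# y[n] = sum_k h[k] x[n-k], truncated to the length of x
y = np.convolve(h, x)[:len(x)]

# After the startup transient (the first len(h)-1 samples),
# the output is exactly the input scaled by the eigenvalue D(w).
assert np.allclose(y[len(h)-1:], D * x[len(h)-1:])
print("y[n] = D(w) * x[n] holds; D(w) =", D)
```

The assertion only skips the first len(h)-1 samples because a finite convolution needs that many samples before the filter fully overlaps the signal; for a truly infinite sinusoid, y[n] = D(w) x[n] holds for every n.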