The input capacitance of the gates of the output FETs is of the order of 500 pF, which at 5 MHz represents an impedance of roughly 64 ohms, so a couple of volts of drive into this impedance represents a current of about 2/64 ≈ 31 mA.
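To make the arithmetic explicit, here is a minimal sketch of that reactance-and-current estimate, using the figures from the post (500 pF, 5 MHz, 2 V of drive are all taken as rough assumptions):

```python
import math

# Assumed rough numbers from the post
f = 5e6          # drive frequency, Hz
C = 500e-12      # total gate capacitance of the output FETs, F

# Capacitive reactance: Xc = 1 / (2*pi*f*C)
Xc = 1 / (2 * math.pi * f * C)

V = 2.0          # a couple of volts of drive (illustrative)
I = V / Xc       # drive current into the gate capacitance, A

print(f"Xc = {Xc:.1f} ohm, I = {I * 1000:.1f} mA")  # ~64 ohm, ~31 mA
```

So the "few volts of drive" do indeed translate into tens of milliamps at this frequency, which is why the driver stage matters.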
Why more drive? Because the output voltage is controlled by the negative feedback: as the open-loop gain decreases, the internal voltages increase. Vout = G × Vdiff, where Vdiff = Vin − Vnfb. So if G falls while Vout stays constant, Vdiff must increase; since Vin is unchanged, the actual voltage being amplified inside the amp must be higher.
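A one-liner makes the relationship concrete: rearranging Vout = G × Vdiff gives Vdiff = Vout / G, so halving the open-loop gain doubles the internal error voltage for the same output (the gain values below are purely illustrative):

```python
# Vdiff required for a given output, from Vout = G * Vdiff
def vdiff_required(vout, gain):
    return vout / gain

# Open-loop gain falling with frequency (illustrative values):
for g in (100_000, 1_000, 100):
    print(f"G = {g:6d}  ->  Vdiff = {vdiff_required(1.0, g) * 1e3:.2f} mV")
```

At DC the error voltage is microvolts; near the gain limit it is tens of millivolts, i.e. the early stages really are working harder.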
I would suppose the current consumption has three main components: 1) the steady DC consumption of the bias in the early stages; 2) the DC losses of the output stage driving a load (capacitive output currents must be considered — I think this sets the maximum output current limit); and 3) the AC losses. Every stage will run out of gain at some frequency, but I think it is the single time constant of the output stage that contributes most of these losses (as above) and effectively determines the amplifier's frequency limit.
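As a back-of-envelope model of that split, one could add a fixed quiescent term to the peak current a capacitive load demands, I = 2πfC·Vpk (all numbers here are hypothetical, just to show the shape of the estimate):

```python
import math

def supply_current(i_bias, f, c_load, v_pk):
    """Crude supply-current estimate: quiescent bias plus the peak
    current delivered into a purely capacitive load at frequency f."""
    i_cap = 2 * math.pi * f * c_load * v_pk   # peak capacitive load current
    return i_bias + i_cap

# Hypothetical figures: 5 mA quiescent, 1 V peak into 500 pF at 5 MHz
total = supply_current(5e-3, 5e6, 500e-12, 1.0)
print(f"~{total * 1000:.1f} mA")   # capacitive term dominates at HF
```

Note how the capacitive term grows linearly with frequency, which fits the idea that the output-stage time constant dominates the AC losses.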
I have never seen any data on this; I suspect it is referred to only incidentally in IC data as "able to deliver 1 V into 50 ohms + 500 pF at 10 MHz". From what you have found, this sort of spec is as likely to be determined by the IC's dissipation as by its slew rate.
Frank