One time our house voltage rose to 130 VAC after a blackout. (Normal is 115-125 V.)
My inverter would not charge the batteries; apparently it required the house voltage to be within a certain range.
I tried a makeshift strategy to reduce the voltage: I ran a long extension cord and plugged both the inverter and a space heater into its far end. This brought the voltage at the inverter down enough for it to charge the batteries.
The extension cord was light duty, and it heated up a little from carrying the heavy current.
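For a sense of the numbers, here is a back-of-the-envelope check of the trick. The figures are assumed (cord gauge, length, and heater current are guesses, not measurements from my setup):

[code]
# Back-of-the-envelope check of the extension-cord trick.
# Cord gauge, length, and heater current are assumed values.

AWG18_OHMS_PER_FT = 6.39 / 1000   # resistance of 18 AWG copper, per foot
CORD_LENGTH_FT = 100              # one-way length of the cord
I_HEATER = 12.0                   # space heater current (A)
I_CHARGER = 2.0                   # inverter/charger current (A)
V_HOUSE = 130.0                   # high line voltage (VAC)

# Round trip: current flows out on one conductor and back on the other.
r_cord = 2 * CORD_LENGTH_FT * AWG18_OHMS_PER_FT

# Both loads share the cord, so the combined current sets the drop.
i_total = I_HEATER + I_CHARGER
v_drop = i_total * r_cord
p_cord = i_total ** 2 * r_cord    # heat dissipated along the cord

print(f"Cord resistance: {r_cord:.2f} ohm")
print(f"Voltage at the far end: {V_HOUSE - v_drop:.0f} VAC")
print(f"Heat in the cord: {p_cord:.0f} W")
[/code]

With those guesses the far end sits near 112 VAC, and the cord dissipates a couple hundred watts along its length, which is why it warms up.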
More generally, you can reduce the voltage by connecting something in series.
The obvious choice is a resistive load in series, possibly a length of nichrome wire from a toaster or space heater.
It is best if you can make it adjustable, for example with a sliding metal clamp as the tap. This can be trouble-prone (particularly if the contact sparks), so install safeguards against fire and against contact with line voltage.
Other components you can try in series are a capacitor or an inductor. The values must be tailored to the load, and it is easy to make a mistake, so monitor the circuit and make sure no part is exposed to excessive voltage.
Suppose your inverter draws 2 A while charging the batteries. You can then calculate which component values will drop 260 VAC to a normal level; a rough sizing sketch follows.
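This is a minimal sketch that treats the charger as a simple resistive load drawing 2 A at about 230 VAC on a 50 Hz line (all assumptions; a real charger is non-linear, so take the results only as starting points):

[code]
import math

# Rough sizing of a series dropper, assuming the charger behaves like a
# simple resistive load. A real charger is non-linear, so these numbers
# are starting points, not final values.

V_IN = 260.0      # high line voltage (VAC rms)
V_LOAD = 230.0    # target voltage at the charger (VAC rms)
I_LOAD = 2.0      # charging current (A rms)
FREQ = 50.0       # line frequency (Hz); use 60.0 where applicable

R_load = V_LOAD / I_LOAD        # equivalent load resistance
Z_total = V_IN / I_LOAD         # total impedance the source must see

# Series resistor: resistances add directly.
R_series = Z_total - R_load
P_series = I_LOAD ** 2 * R_series          # heat the resistor must shed

# Series capacitor or inductor: reactance adds in quadrature with R_load.
X_series = math.sqrt(Z_total ** 2 - R_load ** 2)
C_series = 1.0 / (2 * math.pi * FREQ * X_series)   # farads
L_series = X_series / (2 * math.pi * FREQ)         # henries
V_cap_peak = I_LOAD * X_series * math.sqrt(2)      # peak volts on the cap

print(f"Series R: {R_series:.1f} ohm, dissipating {P_series:.0f} W")
print(f"Series C: {C_series * 1e6:.0f} uF, sees {V_cap_peak:.0f} V peak")
print(f"Series L: {L_series * 1e3:.0f} mH")
[/code]

Under those assumptions this works out to roughly 15 ohm (dissipating about 60 W), 52 uF, or 190 mH. Note that the resistor option burns real power, which is why heater-element nichrome is a sensible source.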
In the case of a capacitor, it must be a non-polarized type, rated for the maximum (peak) voltage it will be exposed to.
In the case of an inductor, it may be possible to construct an adjustable one. It must have a substantial iron core to reach the required inductance at line frequency.
A safer way is an autotransformer. (I believe this is what post #2 refers to.) Its secondary is connected to its primary, which makes it less expensive than a full isolation transformer. It has taps that yield different voltage levels, and it must be rated for your voltage and current needs.
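Picking a tap is simple: the output is just the input scaled by the tap's turns ratio. The ratios below are illustrative, not from any particular unit:

[code]
# Autotransformer output is the input scaled by the tap's turns ratio.
# The tap ratios below are illustrative, not from any particular unit.

V_IN = 260.0  # high line voltage (VAC)

for tap_ratio in (1.00, 0.95, 0.90, 0.85):
    print(f"tap {tap_ratio:.2f}: {V_IN * tap_ratio:.0f} VAC out")
[/code]

A tap around 0.88-0.90 would bring 260 VAC back near 230 VAC.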