I'm doing a rough complexity estimate of an algorithm. What is a reasonable
value of the cost of a real multiplication relative to a real addition? That
is, how many real adds is a real multiplication worth?
I mean I may use a fixed-point hardware platform, so I care about time consumption, hardware complexity, etc.
It depends. If you are doing it on a DSP, chances are the cost is the same: one instruction slot each, though this can vary between DSPs. The same holds for general-purpose CPUs.
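If you want to sanity-check that claim on your own CPU, here is a minimal micro-benchmark sketch (my own illustration, not part of the answer above): it times a chain of dependent double-precision adds against a chain of multiplies. The loop count `N`, the constant operand, and the use of `volatile` are just assumptions to keep the compiler from folding the loops away; expect rough, noisy numbers.

```c
#include <stdio.h>
#include <time.h>

#define N 100000000UL  /* hypothetical iteration count */

int main(void)
{
    volatile double acc = 1.000000001;  /* volatile: prevent loop elimination */
    clock_t t0, t1;

    /* Dependent chain of additions */
    t0 = clock();
    for (unsigned long i = 0; i < N; i++)
        acc = acc + 1.000000001;
    t1 = clock();
    printf("add: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    /* Dependent chain of multiplications */
    acc = 1.000000001;
    t0 = clock();
    for (unsigned long i = 0; i < N; i++)
        acc = acc * 1.000000001;
    t1 = clock();
    printf("mul: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    return 0;
}
```

On most modern CPUs the two times come out in the same ballpark, which matches the "one instruction slot" point.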
If you are designing hardware, then I would say a multiplier can be counted as roughly 5× the cost of an adder. It will of course also depend on the operand width and on the type of multiplier/adder.
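For a rough complexity estimate you can fold these assumptions into a weighted operation count. Below is a minimal sketch, assuming a multiply weight of 1 (DSP/CPU case, one slot each) or 5 (custom hardware case, per the estimate above); the example operation counts for a 1024-point radix-2 FFT are only illustrative, not from the answer.

```c
#include <stdio.h>

/* Total cost in "add-equivalents": each add costs 1, each multiply
   costs mul_cost adds (assumed weight, see lead-in). */
static double estimated_cost(unsigned long adds, unsigned long muls,
                             double mul_cost)
{
    return (double)adds + mul_cost * (double)muls;
}

int main(void)
{
    /* Illustrative real-arithmetic counts for a 1024-point radix-2 FFT:
       roughly 3*N*log2(N) real adds and 2*N*log2(N) real multiplies. */
    unsigned long adds = 3UL * 1024 * 10;
    unsigned long muls = 2UL * 1024 * 10;

    printf("DSP/CPU estimate (mul = 1 add):   %.0f add-equivalents\n",
           estimated_cost(adds, muls, 1.0));
    printf("Hardware estimate (mul = 5 adds): %.0f add-equivalents\n",
           estimated_cost(adds, muls, 5.0));
    return 0;
}
```

The point of the weight is only to rank design alternatives; for a real hardware budget you would replace it with synthesis results for your operand width.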