Why would you like to go that low? It is far from optimal at that point: the generated circuit is going to be more power hungry and slower than the same circuit placed at 50% utilization.
"How the intial utilisation is decided at PD stage."
How about you synthesize things in your technology, then look at the std cell area in the synthesis report? From that you can assume a 50-60% starting utilization, which gives you the core area as the std cell area divided by that utilization.
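As a minimal sketch of that estimate (the area and utilization numbers below are illustrative, not from the post), the floorplan area is just the synthesis std cell area divided by the target utilization:

```python
def estimate_core_area(std_cell_area_um2: float, target_utilization: float) -> float:
    """Estimate the core (placeable) area needed for a given starting utilization.

    std_cell_area_um2: total standard-cell area from the synthesis report, in um^2.
    target_utilization: fraction of the core occupied by cells (e.g. 0.55 for 55%).
    """
    if not 0.0 < target_utilization <= 1.0:
        raise ValueError("utilization must be in (0, 1]")
    return std_cell_area_um2 / target_utilization

# Example: 1.2 mm^2 of std cells placed at 55% starting utilization
# needs roughly 2.18 mm^2 of core area.
core_area = estimate_core_area(1_200_000.0, 0.55)
print(f"core area ~ {core_area / 1e6:.2f} mm^2")
```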
Careful with rules of thumb: they change from tech to tech and stack to stack. Having been doing this for two decades, I can say I have seen one tech where we would start from 40% and move up, and reaching 50% was hell. In another tech, the starting point was 70%, and reaching 80% was doable.
But in any design you need to at least find the breaking point as your due diligence. Start wherever, but build up to find where you start seeing shorts increase, timing crap out, and runtimes explode. It is all part of knowing the design, and this should be true regardless of tech node.
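A hedged sketch of that due-diligence sweep, assuming a hypothetical run_pnr() wrapper around whatever place-and-route flow you use (the function, its return fields, and the stop thresholds are illustrative placeholders, not any real tool's API):

```python
# Hypothetical utilization sweep to find the "breaking point".
# run_pnr() is a placeholder you would wrap around your own flow;
# the metric names (drc_shorts, worst_slack_ns) are assumptions.

def run_pnr(utilization: float) -> dict:
    """Placeholder: launch a P&R run at the given starting utilization
    and return a few key quality-of-results metrics."""
    raise NotImplementedError("wrap your own flow here")

def sweep_utilization(start: float = 0.50, stop: float = 0.80, step: float = 0.05):
    """Step utilization up until the design starts falling apart."""
    results = []
    util = start
    while util <= stop + 1e-9:
        qor = run_pnr(util)
        results.append((util, qor))
        # Rough, design-specific stop criteria: shorts appearing
        # or timing degrading badly.
        if qor["drc_shorts"] > 0 or qor["worst_slack_ns"] < -0.1:
            break
        util += step
    return results
```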
Low utilization means a larger die size and fewer possible dies per wafer, leading to a higher base cost per die. If die cost is not a significant component of product cost, then you might elect to trade that for synthesis "wall time". But doing it up front as a setting, rather than (say) putting a couple of test cases into the hopper, seems like putting the cart before the horse.
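To make the cost argument concrete, a small sketch using the standard first-order dies-per-wafer approximation (the wafer size, wafer cost, and die areas below are made-up illustrative numbers, and yield is ignored):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Classic approximation: wafer area over die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2.0
    return (math.pi * r * r) / die_area_mm2 - \
           (math.pi * wafer_diameter_mm) / math.sqrt(2.0 * die_area_mm2)

def cost_per_die(die_area_mm2: float, wafer_cost: float) -> float:
    return wafer_cost / dies_per_wafer(die_area_mm2)

# Same netlist at 50% vs 30% utilization (example areas):
# lower utilization -> larger die -> fewer dies per wafer -> higher cost per die.
wafer_cost = 10_000.0                       # assumed wafer price, arbitrary
area_at_50 = 4.0                            # mm^2 at 50% utilization (example)
area_at_30 = area_at_50 * (0.50 / 0.30)     # ~6.7 mm^2 at 30% utilization
for area in (area_at_50, area_at_30):
    print(f"{area:.1f} mm^2 -> {dies_per_wafer(area):.0f} dies/wafer, "
          f"${cost_per_die(area, wafer_cost):.2f}/die")
```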