Welcome to EDAboard.com

Device Ro (1/gds) Change from Schematic to Extracted Post-layout

fpmkh0
Newbie level 6
Joined: Apr 18, 2021
Hi, for those with experience, can you please help me understand the reason for the device output resistance (ro = 1/gds) changing from schematic to post-layout? It's a 7nm technology. I checked the current and it's almost the same: Ids changes by only -1%, and gm is also almost unchanged. But I noticed ro dropped by almost 17% post-layout. Any explanation? I checked multiple devices and noticed that the number of multipliers affects this. For example, for a device with 32 multipliers, ro dropped by 7%, while for a device with 256 multipliers, ro dropped by 17%.
 
If your extraction includes resistance or inductance, then you can see source degeneration degrade gm at the operating point, especially at higher drain current. It's also likely that post-extract models (which tend to play with the partitioning of what's "inside" and "outside" a device) embed "stuff" that makes a straight model-to-model, device-to-device matchup show differences. For example, in RFIC flows I knew, the post-layout model tree's subcircuits carried R, L, and C burdens while the schematic model did not (the designer was responsible for best-guessing all of that, in "paper space").

You could test this theory. It comes down to cases in PDK
development.
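To see roughly how series routing parasitics shift what you measure at the external terminals, here is a minimal sketch using the textbook source-degeneration formulas. All component values (gm, ro, the per-multiplier resistance guesses) are made up for illustration and are not from any PDK; note that in this simple series-resistance model the external Rout actually rises, so a measured drop in ro would point at something changing inside the extracted device model rather than pure series R.

```python
# Illustrative small-signal calculation: effect of series source/drain
# resistance (e.g. from extracted routing) on the gm and Rout seen at
# the external terminals of a MOSFET. Hypothetical numbers throughout.

def external_small_signal(gm, ro, rs, rd):
    """Textbook degeneration formulas for series source resistance rs
    and series drain resistance rd around an intrinsic device."""
    gm_ext = gm / (1 + gm * rs + (rs + rd) / ro)  # degraded transconductance
    rout_ext = ro * (1 + gm * rs) + rs + rd       # boosted output resistance
    return gm_ext, rout_ext

# Intrinsic (schematic-level) values -- purely illustrative.
gm = 50e-3   # 50 mS
ro = 2e3     # 2 kOhm

# Guess: routing resistance grows with multiplier count because current
# travels through longer source/drain rails (an assumption, not data).
for m, rs in [(32, 0.2), (256, 0.8)]:
    gm_ext, rout_ext = external_small_signal(gm, ro, rs, rd=rs)
    print(f"m={m}: gm {gm*1e3:.2f} -> {gm_ext*1e3:.2f} mS, "
          f"Rout {ro:.0f} -> {rout_ext:.1f} ohm")
```

Comparing the extracted netlist's operating-point parameters against this kind of hand calculation would tell you whether series parasitics alone explain the shift, or whether the post-layout model itself is repartitioning the device.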
 
