0.18u max drain bias for hot carrier stress

des3glo

Hi, I am testing a TSMC 0.18 um process and really need to know the maximum drain bias I can stress the devices to.

I am trying to conduct some hot carrier stress tests. With the core voltage being 1.8 V, I am thinking the maximum voltage is going to be around 2.2 V. Does anyone have any information that could help, or has anyone carried out hot carrier stress tests on such a process?

Any help would be much appreciated,

Thanks Gethin

TSMC has a document called "0.18um Logic 1.8V/3.3V ACHC Models" that contains a model for HCI degradation. The procedure is to request the document from TSMC, measure or simulate the substrate leakage current of the devices in question, and then apply the formula from that document to calculate the extent of degradation.
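For reference, the generic "lucky electron" aging model that these foundry documents are usually built on looks roughly like the sketch below. The actual TSMC formula and its coefficients are NDA material, so m, n and H here are placeholders you would take from the document or extract yourself.

Code:
def hci_degradation(isub, idrain, width, t_stress, m=3.0, n=0.5, H=100.0):
    """Fractional Idsat degradation after t_stress seconds of DC stress.

    isub, idrain in A, width in um; m, n, H are placeholder fit parameters.
    Age(t) = Id/(W*H) * (Isub/Id)**m * t, and delta_Idsat/Idsat = Age**n.
    """
    age = (idrain / (width * H)) * (isub / idrain) ** m * t_stress
    return age ** n

def dc_lifetime(isub, idrain, width, criterion=0.10, m=3.0, n=0.5, H=100.0):
    """Stress time needed to reach the failure criterion (e.g. 10% Idsat loss)."""
    age_at_failure = criterion ** (1.0 / n)
    return age_at_failure * (width * H) / (idrain * (isub / idrain) ** m)

# Example: 10 uA/um substrate current at 1 mA/um drain current, W = 1 um
print(hci_degradation(isub=10e-6, idrain=1e-3, width=1.0, t_stress=1e4))
print(dc_lifetime(isub=10e-6, idrain=1e-3, width=1.0))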

I'm actually working on the same problem right now, and I just discovered that Spectre has HCI degradation analysis (see "Virtuoso Spectre Circuit Simulator User Guide"). I'm going to try this today.

Thanks for these responses, evi. I guess I just have to apply to TSMC by email to get that document?

Are you working on 0.18 devices? I plan to do a normal operating condition test for 24 hours to see what degradation, if any, there is. Then I will apply a stress test, i.e. 2.0 V on the drain and 1.0 V on the gate at maximum Isub, over a shorter period of 10000 s, taking measurements every 1000 s.

Are you looking at GIDL at the same time? I am also sweeping the gate from -0.5 V to 1 V to see whether the off current changes after the hot carrier stress.
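For the off-current check, something like this is what I have in mind for comparing the pre- and post-stress gate sweeps (just a sketch; the arrays are made-up placeholders for the measured data):

Code:
import numpy as np

vg = np.linspace(-0.5, 1.0, 16)          # gate sweep, V, at fixed drain bias
id_fresh = np.logspace(-11, -4, 16)      # placeholder pre-stress Id, A
id_stressed = id_fresh * 1.3             # placeholder post-stress Id, A

off_region = vg <= 0.0                   # off-state / GIDL part of the sweep
change = id_stressed[off_region] / id_fresh[off_region]
print("off-current change after stress (x):", change.mean())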

Added after 9 minutes:

I have been on the TSMC website but can't find a contact to ask about that document. Any chance you have a contact email, or could you possibly email me that document?

OK, let's keep in touch. I'm working with the TSMC 0.18 um 1P6M mixed-signal process. Here is where I am now:

- I'm trying to avoid any lab testing to save time. But if you do get lab measurements, you can get an estimate of degradation under DC stress conditions. However, if your circuit works under AC (switching transistors, analog blocks, etc.), the lifetime can be much longer, because the real HCI degradation only accumulates during the fraction of the cycle when the device sees a high drain bias with the channel conducting. To get the AC lifetime estimate, simulations with the real circuit need to be run (see the sketch after this list).

- The papers say that HCI degradation is a function of the substrate leakage current for NMOS and of the gate leakage current for PMOS. Regular Spectre simulations with the current TSMC models give a rather inaccurate estimate of the substrate leakage and do not simulate gate leakage at all.

- I found that the Spectre HCI degradation analysis only works with BSIM level 2, so it's useless.

- The real solution is a tool called RelXpert, offered by Cadence. It is a degradation simulator that requires additional model parameters for simulating leakage currents, aging, etc. Those model parameters can either be extracted in the lab (using the Cadence BSIMProPlus model extraction tool) or acquired from somewhere else. Right now I'm trying to find out whether I can get those RelXpert models from TSMC or Cadence.
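Here is the sketch for the AC case - the quasi-static "age" integral that the BERT family of degradation simulators (RelXpert's ancestor) computes over the transient waveforms, as far as I understand from the published papers. The m and H parameters are the same kind of placeholders as before, not TSMC values.

Code:
import numpy as np

def age_from_waveforms(t, id_t, isub_t, width, m=3.0, H=100.0):
    """Numerically integrate Age = Int[ Id/(W*H) * (Isub/Id)**m ] dt.

    t in s, id_t and isub_t in A (same length as t), width in um.
    """
    ratio = np.where(id_t > 0, isub_t / np.maximum(id_t, 1e-30), 0.0)
    integrand = id_t / (width * H) * ratio ** m
    return np.trapz(integrand, t)

# Example: a device that only sees the damaging bias 10% of each cycle
t = np.linspace(0.0, 1e-3, 10001)        # one 1 ms window, 0.1 us steps
stressed = ((t * 1e6) % 100.0) < 10.0    # 10 kHz switching, 10% duty
id_t = np.where(stressed, 1e-3, 0.0)     # 1 mA while stressed, else off
isub_t = np.where(stressed, 10e-6, 0.0)  # 10 uA Isub while stressed
print(age_from_waveforms(t, id_t, isub_t, width=1.0))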

I'll email you some info later.

I am doing DC stress tests at the moment. First I am going to do a normal-operation 0.9 Vg / 1.8 Vd test over 24 hours to see if there is any IDS degradation, then after that move on to hot carrier testing. I am unsure at the moment exactly what bias to use on the drain. I am going to bias the gate at maximum Isub (approx. 1.3 V), and the Keithley docs recommend biasing the drain at 90% of the breakdown voltage, which from the MOSIS files I think is about 3.0-3.2 V. Working on that, the test conditions will be 1.3 Vg / 2.9 Vd.

Then I am going to take measurements every 1000 s over a 10000 s period.
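Here is roughly how I am going to process the readings afterwards - compute the percentage Idsat loss at each point, fit a power law on a log-log scale, and extrapolate to a 10% failure criterion. Just a sketch; the Idsat numbers below are made up, not measurements.

Code:
import numpy as np

t = np.arange(1000, 11000, 1000)                          # stress time, s
idsat0 = 6.00e-3                                          # pre-stress Idsat, A
idsat = np.array([5.95, 5.92, 5.90, 5.88, 5.87,
                  5.85, 5.84, 5.83, 5.82, 5.81]) * 1e-3   # readings, A

deg = (idsat0 - idsat) / idsat0                           # fractional Idsat loss
n, logA = np.polyfit(np.log(t), np.log(deg), 1)           # fit delta = A * t**n
A = np.exp(logA)
t_fail = (0.10 / A) ** (1.0 / n)                          # time to 10% loss

print("degradation at 10000 s: %.2f %%" % (deg[-1] * 100))
print("fit: delta = %.3g * t^%.2f" % (A, n))
print("extrapolated time to 10%% degradation: %.3g s" % t_fail)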

I am also looking at GIDL in these tests - have you done any work like this?

As for AC testing, I can't really do that, I think. I am doing pulsed DC stress tests, though.

Oh, now I understand what you are doing. Gate breakdown will only occur at around 5 V, and S/D diffusion breakdown at even higher voltages. So the only damage you will get by increasing the drain bias within the range up to 3-4 V is from hot carriers. Just keep raising the drain bias until you notice degradation. I never looked at GIDL; it's not a concern for me.
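To get a feel for how fast that accelerates: the classic substrate-current expression, Isub/Id ~ a*(Vds - Vdsat)*exp(-b/(Vds - Vdsat)), is exponential in the drain overdrive, and degradation goes roughly as (Isub/Id)^3, so a few hundred mV more on the drain buys a big acceleration factor. Quick sketch with placeholder constants (a, b and Vdsat are illustrative, not TSMC 0.18 values):

Code:
import math

def isub_over_id(vds, vdsat=0.6, a=2.0, b=6.0):
    """Isub/Id ratio vs. drain bias; a (1/V), b (V) and vdsat are placeholders."""
    if vds <= vdsat:
        return 0.0
    return a * (vds - vdsat) * math.exp(-b / (vds - vdsat))

for vds in (2.8, 3.0, 3.2, 3.5):
    ratio = isub_over_id(vds)
    accel = (ratio / isub_over_id(2.8)) ** 3   # degradation ~ (Isub/Id)**3
    print("Vds = %.1f V   Isub/Id = %.3f   stress vs. 2.8 V = %.1fx"
          % (vds, ratio, accel))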

Hi, just finished some tests. First I biased the gate at 1.3 V and the drain at 2.8 V and got about 0.4% degradation in IDsat. So I upped the drain to 3.0 V, but there was still no significant degradation. For the next test I am going to raise the drain to between 3.2 and 3.5 V. Will let you know how it goes.

OK, I have the puzzle solved:
- Get RelXpert from Cadence
- Request the models from TSMC, called "CL018 ACHC model.zip" - they are actual models for RelXpert with all the degradation parameters needed. I have them but unfortunately cannot send them to you, being under NDA with TSMC. You should be able to get them yourself, though.