
Open loop settings??

Status: Not open for further replies.

pekachoo007

Member level 2
EA.JPG

Is this setup valid for finding the open-loop gain of the opamp?

If yes, how is it done in Cadence Spectre?
 

There must be no opamp capacitive output load; remove one capacitor.

Use high L and C values, e.g. 1000 H / 1000 F or higher, depending on the frequency range of interest.
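For reference, a minimal sketch of such a bench in Spectre netlist syntax, using the 1000 H / 1000 F values suggested above. All node, instance, and subcircuit names (my_opamp, vm, vout, the bias values) are placeholders, not taken from the attached schematic:

```
// Hypothetical open-loop AC bench (sketch, not the thread's schematic).
// The large L closes the feedback loop at DC only; the large C couples
// the AC stimulus into the inverting input while blocking DC.
simulator lang=spectre
Vbias (vp 0)     vsource type=dc dc=600m    // non-inverting input bias (placeholder value)
Vac   (vin 0)    vsource type=dc dc=0 mag=1 // AC stimulus, 1 V AC magnitude
Cin   (vin vm)   capacitor c=1000           // 1000 F: short at AC, open at DC
Lfb   (vout vm)  inductor  l=1000           // 1000 H: short at DC, open at AC
Xamp  (vp vm vout) my_opamp                 // device under test (placeholder subckt)
sweep ac start=1 stop=100M dec=20           // AC analysis over the band of interest
```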
 

There must be no opamp capacitive output load; remove one capacitor.

The opamp is actually an error amplifier, part of a linear voltage regulator. The output capacitance represents the gate capacitance of the pass transistor.

Use high L and C values, e.g. 1000 H / 1000 F or higher, depending on the frequency range of interest.

I'm using 10^12 H and 10^12 F for the L and C values, for an operating frequency of 10 MHz.
 

A small question: why do you need an inductance in the feedback? Can't you directly connect the -ve terminal of the opamp to the required DC potential and measure the open-loop gain? Correct me if I am wrong.
 

The opamp is actually an error amplifier, part of a linear voltage regulator. The output capacitance represents the gate capacitance of the pass transistor.

O.K., in this case the regular output C should be used.
 

What settings should I give Cadence Spectre to get simulation results for gain and phase margin?
 

Run an ac simulation for your frequency range of interest, then display gain (dB20 scale) and phase.
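The display step can be sketched in numpy terms (the sample data below is made up for illustration; Spectre's dB20 and phase functions perform the same conversions on the AC sweep output):

```python
import numpy as np

# Take the complex output of an AC sweep and express gain in dB20
# and phase in degrees. The single-pole response here is hypothetical.
freqs = np.logspace(0, 8, 9)              # 1 Hz .. 100 MHz
vout = 100 / (1 + 1j * freqs / 1e5)       # assumed single-pole response, A0 = 100

gain_db20 = 20 * np.log10(np.abs(vout))   # dB20 scale
phase_deg = np.degrees(np.angle(vout))    # phase in degrees

print(gain_db20[0])   # low-frequency gain: 40 dB for |A0| = 100
```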

- - - Updated - - -

Why do you need an inductance in the feedback? Can't you directly connect the -ve terminal of the opamp to the required DC potential and measure the open-loop gain?

Because you need the DC control loop.

- - - Updated - - -

I'm using 10^12 H and 10^12 F for the L and C values, for an operating frequency of 10 MHz.

Calculate its cutoff frequency and the corresponding time constant! The DC control will take much too long for your simulation period; don't forget the output impedance of the opamp.
 

Run an ac simulation for your frequency range of interest, then display gain (dB20 scale) and phase.

I have done this but am not satisfied with the results.


Calculate its cutoff frequency and the corresponding time constant! The DC control will take much too long for your simulation period; don't forget the output impedance of the opamp.

What would be the use of these extra calculations, and how would they be helpful? Could you also provide the formulas, as you suggested?
 

eares.PNG

Set AC source with AC Magnitude=1 V, DC Voltage=1.1 V, f=10 MHz

dB20 = 44 dB
What would the phase margin be? Is it 180 - 67 = 113°?
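Reading phase margin from a sweep can be sketched like this with numpy (the single-pole model with A0 and fp is an illustrative assumption, not the circuit in the thread): find the 0 dB crossover and take PM = 180° plus the loop-gain phase there.

```python
import numpy as np

A0, fp = 10_000.0, 1e3                 # hypothetical DC gain and dominant pole
f = np.logspace(0, 9, 2001)            # sweep 1 Hz .. 1 GHz
H = A0 / (1 + 1j * f / fp)             # assumed single-pole loop gain

mag_db = 20 * np.log10(np.abs(H))
phase = np.degrees(np.angle(H))

i = np.argmin(np.abs(mag_db))          # grid point nearest the 0 dB crossover
pm = 180.0 + phase[i]                  # phase margin in degrees (~90° here)
```

Note that 180 - 67 = 113° is the phase margin only if -67° is the loop-gain phase read at the unity-gain (0 dB) frequency, not simply at 10 MHz.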
 

What would be the use of these extra calculations, and how would they be helpful? Could you also provide the formulas, as you suggested?

Just use the known or estimated value of the opamp's DC output resistance R and calculate the cutoff frequency of this low-pass filter and its time constant; the inverse of the cutoff frequency gives the time constant of your DC input control voltage, i.e. before this time you can't expect to get the correct input bias.

At this cutoff frequency you can neglect the impedance of your chosen L = 10^12 H against the opamp's DC output resistance R. Let's say R = 100 Ω; then, with your chosen C = 10^12 F, the cutoff frequency is fco = 1/(2πRC) ≈ 1.6·10^-15 Hz, and its time constant is τ = 2πRC ≈ 6.28·10^14 s ≈ 20 million years. After this time you can expect that the input bias has reached about 63% of its correct value, and more than 99% after 5τ ≈ 100 million years.
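The arithmetic above can be checked quickly; R = 100 Ω and C = 10^12 F are the values from the post, and τ = 2πRC follows the post's definition (the conventional RC time constant would be R·C):

```python
import math

R, C = 100.0, 1e12                    # assumed output resistance, chosen capacitance
f_co = 1 / (2 * math.pi * R * C)      # cutoff frequency in Hz
tau = 2 * math.pi * R * C             # settling time constant in seconds, per the post
years = tau / (365.25 * 24 * 3600)

print(f_co)   # ≈ 1.6e-15 Hz
print(years)  # ≈ 2e7, i.e. about 20 million years
```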
 
before this time you can't expect to get the correct input bias
SPICE achieves a correct bias point in "no time" because it shorts all Ls and opens all Cs during the initial operating-point solution.
 
pekachoo007, is the max. gain of 40 dB the value you expected?
Why a DC voltage of 1.1 V?
 

At this cutoff frequency you can neglect the impedance of your chosen L = 10^12 H against the opamp's DC output resistance R. Let's say R = 100 Ω; then, with your chosen C = 10^12 F, the cutoff frequency is fco = 1/(2πRC) ≈ 1.6·10^-15 Hz, and its time constant is τ = 2πRC ≈ 6.28·10^14 s ≈ 20 million years. After this time you can expect that the input bias has reached about 63% of its correct value, and more than 99% after 5τ ≈ 100 million years.

Why fret about millions of years? The values are only for the test bench.

- - - Updated - - -

pekachoo007, is the max. gain of 40 dB the value you expected?
Around 70.

Why a DC voltage of 1.1 V?

What should it be then?
 

The purpose might be certain, with the certainty principle lying somewhere.
So, you don't know why?
This is not a big problem; however, in this case you simply should ask instead of (blindly) using a DC component without knowing its purpose.
 

