
Smallest technology node in analog/RF design


Chethan


Hi all,
Can someone tell me the smallest technology node used in industry today for analog/RF design? I had heard that 0.13 µm is the smallest used for analog/RF. Is that true?
Has anyone done any analog/RF design at a node smaller than 0.13 µm?
Thanks in advance,
Chethan
 


0.18 µm is OK too.
 

The big players are working on 0.09 µm (90 nm) as well, and quite a number of research papers in this technology have appeared recently. However, as far as my information goes, it is not a mature technology yet.

Regards
 

Intel recently released its Pentium chip in 65 nanometer technology.
Their next chip will be 45 nanometer in mid-2007,
and in 2009 they are going to release a chip in 32 nanometer technology, in line with Moore's law.

But most of the other companies are working in 90 nanometer technology.

vijay
 

Posted the same in another thread...
65 nm CMOS is being used in RF, though it is still in the research phase. People are jumping directly from 0.13 µm to 65 nm without doing much work at 90 nm. The reason is supply scaling: going from 0.13 µm to 90 nm the supply voltage drops from 1.8 V to 1.2 V, but from 90 nm to 65 nm the supply stays at 1.2 V. So you get better switches at apparently no extra cost.
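To put rough numbers on that point, here is a minimal back-of-the-envelope sketch in Python. It uses a simple long-channel triode-region estimate and illustrative, textbook-style parameters for each node (the Lmin, Vdd, Vt and µnCox values are assumptions, not real PDK data), and compares the on-resistance of a 1 µm wide, minimum-length NMOS switch driven to the full supply:

```python
# Back-of-the-envelope switch comparison across nodes.
# All numbers below are illustrative assumptions, not foundry PDK data.

nodes = {
    # name: (Lmin [um], Vdd [V], Vt [V], mu_n*Cox [A/V^2])
    "130nm": (0.13,  1.8, 0.45, 270e-6),
    "90nm":  (0.09,  1.2, 0.35, 300e-6),
    "65nm":  (0.065, 1.2, 0.30, 330e-6),
}

for name, (lmin, vdd, vt, kn) in nodes.items():
    vov = vdd - vt                      # gate overdrive with the gate tied to Vdd
    # Triode-region on-resistance of a 1 um wide, minimum-length switch:
    # Ron ~ 1 / (mu_n*Cox * (W/L) * Vov)
    ron = 1.0 / (kn * (1.0 / lmin) * vov)
    print(f"{name}: Vov = {vov:.2f} V, Ron(W=1um, L=Lmin) ~ {ron:.0f} ohm")
```

With these assumed numbers the 0.13 µm to 90 nm step barely improves the switch, because the shorter channel is offset by the lost overdrive when the supply drops from 1.8 V to 1.2 V, whereas the 90 nm to 65 nm step improves Ron noticeably since the supply stays at 1.2 V while L and Vt keep shrinking.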
 

Is the flicker noise corner frequency at 65 nm, roughly 10-50 MHz depending on device size, higher than the channel bandwidth of most communication standards? (A rough estimate is sketched after this post.)

Is the gm variability of minimum-length devices about 50%?

Is the threshold voltage shift after 1 year at 70 °C 10-20%?

Who models all these effects?

Who takes care of these effects and takes design countermeasures?

Who spends that much money on design and on a $1M mask set?

What is the advantage over 130 nm/180 nm?

Is there a 16-bit 10 GS/s converter so that the rest can be done in 65 nm digital?
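Regarding the flicker-noise corner question above, here is a minimal estimate, again with assumed numbers (K_f, C_ox, g_m and γ are illustrative values, not measured 65 nm data). The corner is the frequency where the input-referred 1/f noise K_f/(C_ox·W·L·f) crosses the thermal noise floor 4kTγ/g_m:

```python
# Rough 1/f noise corner estimate for a minimum-length 65 nm NMOS.
# All parameter values are assumptions for illustration, not foundry data.

k_B   = 1.38e-23   # Boltzmann constant [J/K]
T     = 300.0      # temperature [K]
gamma = 1.0        # thermal-noise excess factor (assumed)

K_f   = 5e-25      # flicker-noise coefficient [V^2*F] (assumed, process dependent)
C_ox  = 1.5e-2     # gate capacitance per area [F/m^2] (~15 fF/um^2, assumed)
W     = 1.0e-6     # device width [m]
L     = 65e-9      # minimum channel length [m]
g_m   = 0.5e-3     # transconductance at the assumed bias point [S]

# Input-referred flicker noise:  S_1/f(f) = K_f / (C_ox * W * L * f)
# Input-referred thermal noise:  S_th     = 4 * k_B * T * gamma / g_m
# Corner frequency where the two are equal:
f_corner = K_f * g_m / (C_ox * W * L * 4 * k_B * T * gamma)

print(f"Estimated 1/f corner: {f_corner / 1e6:.1f} MHz")
```

With these assumptions the corner lands around 15 MHz; a smaller W·L or a larger K_f pushes it higher, which is why the 1/f corner can end up above the channel bandwidth of many communication standards.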
 
