Why is visible light not harmful when microwaves are... even though it has a higher frequency?


shirish heller

Visible light has a higher frequency than microwaves, yet microwaves are said to be very harmful to the human body while visible light is not (its higher frequency and energy should let it penetrate the body even more).
I cannot figure out the reason for this. Please help!
 

Because microwave radiation causes water molecules to vibrate.
https://scitech.web.cern.ch/scitech/TopTech/01/MicroWaveOven/microwave_2.shtml

And as the human body is mostly water...you get cooked.

UV, even at low doses, isn't so benign... skin cancer isn't all that good for a human body.

Visible light, when collimated with a high concentration of photons (i.e. a laser), isn't so good for the human body either.

The harmfulness boils down to how much energy you absorb (pun only slightly intended).
 

I think people are mostly confused and fear what they do not understand.
At the same power level, say 1 W, light can blind you and UV can burn your skin, but a 2.45 GHz microwave will barely feel warm to your hand.
Microwave ovens generate more than 500 W, and that power is absorbed by the water in the food, which is why heating and cooking work inside a closed oven cavity. If you ran the oven with the door open (not recommended), your eyes would be cooked quickly, and your flesh too.

If you use a 500 W lamp at home, it can blind you and burn your skin as well.

Controlled microwave power is used in medicine to heat and destroy tumors without leaving metastases. Power levels of up to 50 W are used to heat tumors to 42-43 °C so they disintegrate quickly while the healthy tissue survives.

People fear "microwave danger" from a 0.5 W cell phone, yet they let kids watch a microwave oven running at full power (500 W).

So learn the details and be qualified!
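A rough sanity check of that hyperthermia figure, as a sketch only: the 50 g tumour mass and the assumption that all 50 W is absorbed with no blood-flow cooling are illustrative guesses, not numbers from the post.

```python
# Time for ~50 W of absorbed microwave power to warm a small tumour
# from body temperature (37 C) to the ~43 C mentioned above.
# Assumed: 50 g of tissue, specific heat close to water, all power
# absorbed, no cooling by blood flow (so this is optimistic).
mass_kg = 0.050
c_tissue = 4186.0          # J/(kg*K), roughly water
delta_T = 43.0 - 37.0      # K
power_W = 50.0

energy_J = mass_kg * c_tissue * delta_T   # Q = m * c * dT
time_s = energy_J / power_W
print(f"energy needed: {energy_J:.0f} J, heating time: {time_s:.0f} s")
# -> about 1.3 kJ, i.e. on the order of half a minute at 50 W
```

In practice blood perfusion carries heat away, so real exposure times are longer and carefully controlled.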
 
The power flux reaching the top of the Earth's atmosphere from the Sun (as visible light) is about 1400 W/m^2, which is much more than the 500 W you are talking about, so why are we not blinded then? And what is the power of the visible light emitted by a light bulb? I guess it is more than 1 W at least...

The concept is still not clear to me. Please help!
 

It's all a matter of dispersed versus concentrated light energy...

Have you ever played with a magnifying glass?

Imagine a huge Earth-sized magnifying glass concentrating the 1400 W/m^2 that strikes the entire Earth and focusing it into a spot the size of a basketball (it makes for a good doomsday aliens-attacking-Earth movie); basically anything at the focal point of that magnifying glass would be incinerated or melted. Fortunately there is no alien magnifying glass between us and the Sun, so the light energy that reaches the upper atmosphere stays spread out, and part of it is scattered and absorbed on the way down.

If you were to stare directly at the Sun you would blind yourself, because the light gets focused onto your retina by the lens of your eye. I imagine you could injure yourself with a 150 W light bulb and a magnifying glass too (I wouldn't know for sure without either a) calculating the energy concentration or b) trying it experimentally). If you've ever accidentally looked at a bare, unfrosted 150 W incandescent bulb, you see spots for many minutes afterwards; once again, frosting the inside of the bulb diffuses the light from the filament.
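Just to put numbers on that doomsday lens, here is a sketch with assumed sizes (Earth's mean radius and a roughly 24 cm basketball):

```python
# Concentration factor of an Earth-sized lens focusing sunlight
# (1400 W/m^2) into a basketball-sized spot -- purely illustrative.
import math

solar_flux = 1400.0      # W/m^2 at the top of the atmosphere
r_earth = 6.371e6        # m, Earth's mean radius
r_ball = 0.12            # m, roughly a basketball

collected_W = solar_flux * math.pi * r_earth**2   # power hitting Earth's disc
spot_flux = collected_W / (math.pi * r_ball**2)   # flux at the focal spot

print(f"collected power : {collected_W:.2e} W")
print(f"flux in the spot: {spot_flux:.1e} W/m^2 "
      f"({spot_flux / solar_flux:.1e} x normal sunlight)")
```

Same sunlight, concentrated by a factor of about 10^15; the point is concentration, not the total energy changing.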
 

Look at it (pun intended) this way: what is the aperture area of the pupil of your eye? How much of 1400 W/m^2 can pass through it? Still enough to cause harm, but in relative terms quite a small amount.

Brian.
 

It all depends on the so-called power density, or flux: how much power falls on a given surface area. Sunlight has a flux of about 1400 W/m^2, or 1.4 mW/mm^2. That means at most about 4 mW can enter the eye through the pupil, and for well-known reasons nobody stares at the Sun directly anyway. Sunlight focused through a convex lens can burn: a lens 5 cm in diameter collects roughly 2.7 W, but concentrated onto a focal spot of less than a square millimetre that corresponds to a flux of several W/mm^2. Without the lens, the Sun would have to deliver several MW/m^2 for the same effect.
A mobile phone transmits at most about 5 W through its antenna, in all directions. Held to the ear, the head absorbs roughly half of that power over an area of about 0.02 m^2, so the flux is only about 125 W/m^2, more than ten times less than sunlight.
Which of the two, light or microwaves, is "more harmful" biologically belongs to other sciences.
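Here is a quick numerical check of those figures, as a sketch; the 2 mm pupil, the 10 cm focal length, and the 5 W handset figure quoted above are assumptions:

```python
# Rough power-density comparison: sunlight through the pupil, sunlight
# focused by a small lens, and a phone held against the head.
import math

SOLAR_FLUX = 1400.0              # W/m^2

def disc_area(d_m):
    """Area of a disc of diameter d_m (metres)."""
    return math.pi * (d_m / 2) ** 2

# Sunlight entering a contracted (~2 mm) pupil
pupil_power = SOLAR_FLUX * disc_area(2e-3)

# Sunlight collected by a 5 cm lens, focused into the solar image:
# the Sun subtends ~9.3 mrad, so a 10 cm focal length gives a spot
# roughly 0.9 mm across.
lens_power = SOLAR_FLUX * disc_area(5e-2)
spot_area = disc_area(0.1 * 9.3e-3)
focal_flux = lens_power / spot_area

# Phone held to the ear: 5 W radiated, half absorbed over ~0.02 m^2
phone_flux = 0.5 * 5.0 / 0.02

print(f"power through pupil : {pupil_power * 1e3:.1f} mW")
print(f"power through lens  : {lens_power:.2f} W")
print(f"flux at focal spot  : {focal_flux / 1e6:.1f} MW/m^2")
print(f"phone flux at head  : {phone_flux:.0f} W/m^2 (sunlight: {SOLAR_FLUX:.0f})")
```

Milliwatts into the eye, a few watts through the lens, megawatts per square metre at the focal spot: same Sun, very different power densities.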
 

Have you ever played with a magnifying glass?
Imagine a huge Earth-sized magnifying glass concentrating the 1400 W/m^2 that strikes the entire Earth and focusing it into a spot the size of a basketball
Do not forget the resulting shadow outside that basketball spot. Deadly ice-cold, pretty fast.

In the past I worked several hours a week in front of a transmitting antenna radiating 1 kW at 1 GHz. Sometimes I even put the antenna in my pocket while climbing around on EMC chambers to check for RF leakage with it.
After about 10 s there was a strong burning feeling in my pocket, so I had to climb fast. That was 20 years ago, and no harm... yet.

A microwave oven emits at best 500-800 W. Many electric stoves can easily emit 2 kW, and the radiation can be felt from well away. Would anyone dare to stand in front of that radiation source and fry eggs for five minutes? Some do it all day long for a living.
I doubt that it can be healthy.

Another kind of radiation: sunbed tanning, around 1 kW of UV placed close around your body for an hour. Could offering that kind of sunbed be a sellable business idea, expecting customers who are otherwise afraid of cellphone radiation?

Much of it is about intensity and time. A laser beam emitting power in the mW range can cause permanent harm if pointed at an eye, especially at an invisible wavelength, since the eye will not automatically close.
Radio waves are much harder to concentrate the way a laser beam is.

Heating from radio waves is probably more harmful at really low frequencies, below about 10 MHz, if the available TX power is unlimited: by the time heat is felt at the skin, almost the same frying effect is penetrating deep into the body. The saving grace is that at such wavelengths most of the energy simply passes through, because our body is too short to act as an effective absorbing antenna. Compare that with GHz frequencies and upward, including visible light, where almost all the energy is absorbed within the skin depth.
It takes a strong flashlight, for example, to get any light at all through the palm of a hand.
For the same reason a microwave oven is poor at penetrating most food: it can heat the outside of a potato, yet the middle can stay cold if the outer layer is not given time to conduct the heat inward.
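Here is a sketch of that frequency dependence, using the standard plane-wave attenuation formula for a lossy medium; the tissue parameters (relative permittivity and conductivity) are rough illustrative values, not measured data:

```python
# 1/e field penetration depth of a plane wave in a lossy dielectric:
#   alpha = w * sqrt(mu*eps/2) * sqrt( sqrt(1 + (sigma/(w*eps))**2) - 1 )
#   depth = 1 / alpha
import math

MU0 = 4e-7 * math.pi
EPS0 = 8.854e-12

def depth_m(freq_hz, eps_r, sigma):
    """Field penetration depth for a plane wave in a lossy medium."""
    w = 2 * math.pi * freq_hz
    eps = eps_r * EPS0
    loss = sigma / (w * eps)
    alpha = w * math.sqrt(MU0 * eps / 2) * math.sqrt(math.sqrt(1 + loss**2) - 1)
    return 1 / alpha

# Rough muscle-like tissue parameters (assumed, for illustration only)
cases = [
    ("10 MHz",   10e6,   170.0, 0.6),   # eps_r, sigma in S/m
    ("2.45 GHz", 2.45e9,  53.0, 1.7),
]
for name, f, er, s in cases:
    print(f"{name:>8}: penetration depth ~ {depth_m(f, er, s) * 100:.0f} cm")
# Low frequencies penetrate tens of centimetres; 2.45 GHz only a couple
# of centimetres; optical frequencies only millimetres or less.
```

That is why the potato heats from the outside in, and why low-frequency heating, when it does couple in, is the nastier kind.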
 

With all electromagnetic (EM) energy there are properties such as reflection coefficient and transmission loss.

From Planck's relation we know that photon energy increases with frequency. Compared with UV and long-wavelength IR radiation, visible light is generally not strongly absorbed by bulk skin tissue, because the photons are reflected and scattered by structures of roughly their own wavelength.

However, narrow bands of visible wavelengths, and much more of the UV, are strongly absorbed by certain components such as pigments and blood.

The net result of backscattered and absorbed visible radiation determines skin colour and the white of our eyes; the eye is also sensitive to IR heat at extreme solar or heat levels, which can cause cataracts (it is cataractogenic).

The maximum power level for microwave cell phones is based on the maximum absorbed watts per unit mass of the surrounding flesh (the SAR limit, averaged over a gram or so of tissue).
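For scale, that per-mass limit translates directly into a worst-case heating rate; a sketch assuming a regulatory-style limit of about 2 W/kg and ignoring blood-flow cooling:

```python
# Initial temperature-rise rate implied by a given SAR, ignoring blood
# perfusion and heat conduction (so this is an upper bound).
sar = 2.0             # W/kg, assumed limit of the usual order of magnitude
c_tissue = 4186.0     # J/(kg*K), roughly water

dT_dt = sar / c_tissue   # K/s
print(f"{dT_dt * 1e3:.2f} mK/s, about {dT_dt * 60 * 1e3:.0f} mK per minute")
# -> roughly 0.5 mK/s; real tissue, cooled by blood flow, warms far less.
```

In other words, a phone at the limit is a purely thermal effect of a fraction of a millikelvin per second.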

During WWII, tank operators whose heads were right in front of the high-power radio antenna would go unconscious and collapse, until the crews eventually figured out the reason and moved the antenna away.

MRI scanners use high peak RF power at fairly low frequencies, but with low burst duty cycles that keep the average exposure at safe levels.

The clear dividing line for damage is when the absorbed energy is high enough to knock electrons out of molecules entirely (ionization, with secondary electron emission) and so alter the chemistry. Sub-ionization damage is possible, but cells are replaced continually, and who would notice if a few were lost or not replaced, short of extreme exposure with immediate effects?
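The photon-energy side of that dividing line can be put into numbers with E = h·f; the roughly 10 eV threshold used below is an approximate, commonly quoted figure for ionizing biological molecules:

```python
# Photon energy E = h*f for a few representative frequencies, compared
# with a rough ~10 eV ionization threshold for biological molecules.
H = 6.626e-34      # J*s, Planck's constant
EV = 1.602e-19     # J per electronvolt
C = 3.0e8          # m/s, speed of light

bands = {
    "2.45 GHz microwave":    2.45e9,
    "green light (~550 nm)": C / 550e-9,
    "UV-C (~250 nm)":        C / 250e-9,
}
for name, f in bands.items():
    print(f"{name:22s}: {H * f / EV:.2e} eV per photon")
print("typical ionization energies: roughly 10 eV and up")
# A microwave photon carries ~1e-5 eV, about a million times below the
# ionization threshold; it can only heat by making molecules rotate/vibrate.
```

No matter how many such photons arrive per second, each one individually is far too weak to break a chemical bond; collectively all they can do is deposit heat.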

Many fear the effects of microwave ovens, but I have yet to see journal papers showing any difference in protein damage between the same power absorbed at 1000x the frequency (i.e. your IR oven).

The protein or nutrient degradation in food comes from prolonged elevated temperature, not from any resonance effect of microwaves versus IR.

The EMI screen in a microwave oven's door reduces the field outside to below 5 milliwatts of microwave radiation per square centimetre.
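For comparison, converting that leakage limit into the same units as sunlight (the 5 mW/cm^2 figure is the one quoted above):

```python
# Oven-door leakage limit expressed in W/m^2, next to ordinary sunlight.
leak_W_per_m2 = 5e-3 / 1e-4     # 5 mW per cm^2 -> W per m^2
print(f"leakage limit: {leak_W_per_m2:.0f} W/m^2 versus ~1400 W/m^2 of sunlight")
```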

I recall RF techs in the late 70s who got red eyes from tuning 100 W transmitters into a dummy load, because of RF leakage from the PWB, so the design was changed to keep the lid on, with tuning holes, for safety.
 