Test Microwave for Radiation Leakage
ismymicrowaveleaking.isotropic.us

This was one of my first app ideas about 10 years ago. Unfortunately my microwave had a mechanical timer, such that if you opened the door while it was running the microwaving would stop, and on closing it again it would resume. Unbeknown to me the door had been so opened, and on closing, with my HTC Desire inside, it duly resumed.
That's an awful feature.
It is pretty convenient as long as you aren't putting cell phones into it.
The screening/Faraday cage will still work with the mains power disconnected, so one should ensure the mains plug is removed before such a test.
Note: I'm a past master at destroying old electronics in my microwave (this way I can't change my mind later and then not throw it out).
About three seconds is all it takes (just time for the magnetron filament to warm up) - any longer and the stuff starts to stink from the heat. There's no second chance if you make a mistake.
This seems like an odd way to destroy electronics. What's wrong with a hammer?
It's quick and easy and less dangerous (and more final) than a hammer!
I've been known to change my mind halfway through destroying something when I think of another use for it - the microwave method ensures that the part of me that doesn't like throwing out electronic things that still work always gets thwarted.
The idea of microwaving electronics came to me a long while ago for very different reasons. In the early days of microcomputer control systems I used to work with Intel Multibus computer systems (these are so old you better check this Wiki out to know what I'm talking about: https://en.m.wikipedia.org/wiki/Multibus).
In those days Multibus boards were hellishly expensive and we used quite a lot of them. If one went faulty then we either fixed it at board level or if it failed within the warranty period then we returned it to the agent who then had it fixed (boards were always repaired - never thrown away).
Anyway, one board we received had an intermittent fault and I had it returned under warranty and the agent supposedly fixed it and returned it but it was still intermittent (you know, it had one of those annoying intermittents that was rare enough for one to think the problem was fixed when it wasn't). This toing and froing to the agent occurred at least three times and the problem still wasn't solved so I put an end to it by putting the board in the department's lunchroom microwave.
The plan was to damage a sufficient number of ICs to render the board beyond economic repair (BER) but not make the matter of microwaving it obvious to the supplier. This was very successful: a new board with a different serial number was supplied and the intermittent problem disappeared.
Producing a truly defective board without obvious signs of user 'abuse' requires some care. The aim is to provide just sufficient microwave energy to puncture or short-circuit the junctions in the ICs but not make it obvious (in practice, about one second of RF is needed to accomplish this).
Excessive RF will heat the silicon to the point where it boils the ICs' encapsulating compound and one ends up with little craters in the middle of the ICs which make it obvious the board's been zapped.
Most microwave ovens take about 3 seconds before any RF appears (no RF will appear until the magnetron cathode is hot enough to become emissive), so it's important to calibrate your oven before embarking on such an exercise.
I've used this technique very effectively a number of times over the years. I must mention another incident from long after the first, though this one had nothing to do with Multibus.
It involved a setup jig for programming certain electronic equipment and its motherboard was about as complex as a Multibus one. When motherboards failed I either fixed them or returned them to the manufacturer's repairer who was in a third country (he was renowned for fixing dead boards that people like me considered BER).
Again, we had an intermittent board that had crossed the oceans a number of times and it was still intermittent, so I finally gave it the microwave treatment and returned it to him. Right, this silly bugger actually wasn't going to let it defeat him: he returned the board to me with a message to the effect of 'this was a very problematic board, I had to replace about half the ICs [about 50 total per board], seems it was hit by a lightning surge'. Well, the board worked - sort of - but it had new and different intermittents (presumably, some of the ICs he'd not replaced were suffering breakdown puncture in their SiO2 insulation and thus were experiencing partial failures).
I returned the board again without any extra zapping and this time it was replaced with a different one (presumably, he'd either given up on it or wasn't prepared for my wrath when or if it failed again).
BTW, one can get pretty good at this if you calibrate your microwave carefully. Such zapping should never nuke more than about 25% of the ICs. This is usually sufficient to deter most diehard repairers whilst not drawing their attention to the actual cause of the failure.
Fair enough! Personally, when I reuse broken electronics, it's often because I'm scavenging more mechanical parts like ports and switches. I think those would tend to survive either method, as long as you're not utterly destroying the device.
Right, scavenging for mechanical parts is also my main aim (LSIs etc are now so specific they're useless as spares). For example, only a week or so ago I had to replace a PCB-mounted USB connector on one of my PVRs and I scavenged its replacement from an old mobo. :-)
Neat idea, though it's not a definitive test. In my experience a microwave oven cavity provides about 40dB attenuation, and it's possible to maintain an LTE connection with that sort of path loss.
Hint: If you're ever doing radio work in the field and need an instant Faraday cage a microwave oven is a good candidate. Turn the power off first to reduce the risk of accidentally irradiating your device, but leave it plugged in to get an earth connection to the shield. (This is how I know an LTE connection can be maintained in a microwave oven.)
I'll confirm this works very well for 3G (600-900MHz) although I cut the hot off the plug out of paranoia when using one this way. Some of the 4&5G bands are up above 3GHz where it won't work so well.
Your WiFi signal, however, is at almost exactly the same frequency, and there are many apps that will show and record signal strength in dB (as described in TFA). You could measure with the door open vs. closed, if you're really careful not to turn it on. A lithium battery fire could respond very badly to water!
Depending on your region, LTE could be using much higher frequencies. Microwave ovens use a fixed frequency and the cage is designed for that. Considering this I would say it's still an effective test.
An alternative method: build a LEctenna.
https://www.nrl.navy.mil/STEM/LEctenna-Challenge/
I stumbled across this little project a few weeks back, ordered the parts (just a diode and an LED), and it works. Put a bowl of water in the microwave (or dinner), turn it on, then wave the lectenna around the cracks and see where it lights up.
I originally found the lectenna by researching if it was possible to power an LED wirelessly by leeching power from a house 60Hz line. I haven't made any progress on that, so if you have ideas I'd love to hear them.
If you run a low-voltage 24V DC power cable for lighting next to a 220V AC line (I'm in Europe, no idea if 120V does the same), it's quite easy to get a situation where there is enough power for LED lights to be lit even when switched off. This is why low-voltage wiring (lighting, ethernet, HDMI, etc.) should not be run parallel to AC power cables.
Oscilloscopes often pick up some small voltage at 60Hz when the probe isn't connected to anything, and that voltage increases when a human touches the probe, since the body acts as an antenna. But this is because of the scope's high input impedance (1-10 MΩ); as soon as you try to power something off that, the voltage just collapses to zero.
Idk if OP is the person who wrote this, but on my device the red text about not turning the microwave on is not especially legible. I’d probably make that text a bit bigger and/or brighter. I expect most HN readers will know not to microwave a phone but you’d be surprised what people might do.
Indeed, it's not especially legible on common devices in general: WCAG suggests ensuring suitable perceived contrast ratios [1], and there are checkers around [2], while this page's #f00 on #36393F is far from meeting those.
For safety's sake, it might be nice to also recommend unplugging the microwave oven altogether as the first step.
[1] https://www.w3.org/TR/UNDERSTANDING-WCAG20/visual-audio-cont...
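For anyone curious how far off it is, here's a rough back-of-envelope check of the WCAG relative-luminance/contrast math for those two colors (my own sketch using the WCAG 2.x formulas; #ff0000 is just #f00 expanded):

```typescript
// Quick WCAG 2.x contrast check for #f00 (= #ff0000) text on a #36393F background.
// Uses the standard relative-luminance and contrast-ratio formulas from the spec.
function srgbToLinear(channel: number): number {
  const s = channel / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map(i => srgbToLinear(parseInt(hex.slice(i, i + 2), 16)));
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio("#ff0000", "#36393F").toFixed(2)); // ~2.90, well short of WCAG AA's 4.5:1 for normal text
```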
In dark mode it’s very legible. I expect most people know not to microwave their phones except maybe children.
> I expect most people know not to microwave their phones
I think you're vastly underestimating how little people know. A friendly FYI: seems you're in some kind of a bubble.
Personally I didn't learn about don't-microwave-metal (including phones) until I was about 20 -- and at that time, I had already been studying physics at university for a year, and knew vastly more than most people ever will. I wouldn't say I was a child at that time.
Be happy if people in general realize that photons and coronavirus aren't the same thing, although they're both small.
I wouldn't count on that: many things that seem straightforward, especially to an adult tech-savvy person if those things are tech-related, are far from that for others. And even tech-savvy people can be (and often are) careless, distracted/preoccupied/unfocused, making mistakes; I think pressing a "start" button automatically upon closing the door is something that can happen easily. Also as a rule, extra guards, safety measures, and/or warnings are good, especially when irreversible damage can happen.
Edit: actually there's an example of an unexpected mistake in this thread already, in the comment by hanoz. As mentioned before, listing unplugging as the first step would be useful.
Actually, this is not as stupid as it seems. I was first baffled by the idea that a website can read the RSSI of the device I'm on, then I realized it's probably just measuring the latency by pinging every second.
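I haven't looked at the site's source, but a minimal sketch of that approach is just a timer that fetches something tiny and records the round-trip time; once the phone is properly shielded the fetches fail or time out. Something like this (the /ping endpoint and the 1-second interval are made up, not the site's actual code):

```typescript
// Hypothetical sketch of a "can the phone still reach the internet?" poller.
// This is NOT the site's actual code; the /ping URL and 1 s interval are made up.
const PING_URL = "/ping";

async function pingOnce(timeoutMs = 2000): Promise<number | null> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  const start = performance.now();
  try {
    await fetch(`${PING_URL}?t=${Date.now()}`, { cache: "no-store", signal: controller.signal });
    return performance.now() - start; // round-trip time in milliseconds
  } catch {
    return null; // timed out or unreachable: the cage is doing its job
  } finally {
    clearTimeout(timer);
  }
}

setInterval(async () => {
  const rtt = await pingOnce();
  console.log(rtt === null ? "no response (shielded?)" : `rtt ${rtt.toFixed(0)} ms`);
}, 1000);
```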
I remain in awe that we trust the very cheapest plastic under repeat load and thermal cycling to form a -40dB seal, turning 700W into 70mW. Also, it seems that the outer metal shell of the device forms an active part of the circuit, so if it isn't plugged into a grounded outlet it sits at lethal potential. Then there is beryllium oxide in the thing...
I kind of doubt an independent inventor could bring this to market with today's startup climate.
> "I kind of doubt an independent inventor could bring this to market with today's startup climate."
Especially the kind of inventor who created microwaves for experiments with reanimating frozen hamsters, cough James Lovelock.
(Tom Scott's video "I promise this story about microwaves is interesting" which includes a brief interview with James Lovelock last year at age 101 - https://www.youtube.com/watch?v=2tdiKTSdE9Y )
A Faraday cage does not require grounding to work, but it's still recommended to properly ground microwaves.
Also, unless your electrician had a catastrophic fuck up, the metal cage will never be at live voltage, with or without grounding.
The specific scenario I have in mind is a student living on their own for the first time, cooking with a microwave in their room. Many catastrophic fuckups that we were quick to forget took place during this phase in life.
You might take apart a microwave oven and discover that there are actually several pieces of sheet metal in them. Expensive ones connect them by staked wires. The capacitor is usually rated for 2000 volts and it packs quite a punch.
What does a student living on their own have to do with microwaves having "lethal potentials"? And what does the high-voltage capacitor have to do with sheet metal?
I'm sorry, but I really cannot follow your logic here.
Have you ever looked inside a very dirty computer? It's quite simple.
> I remain in awe that we trust the very cheapest plastic under repeat load and thermal cycling to form a -40dB seal, turning 700W into 70mW
It's the metal grid in the window (with holes smaller than the wavelength of the microwaves), and the metal shell of the cavity, not any "plastic." The same reason the metal grid works is why there doesn't need to be a perfect door seal. As long as the gap is smaller than the wavelength of the microwaves, it's fine.
> Also, it seems that the outer metal shell of the device forms an active part of the circuit, so if it isn't plugged into a grounded outlet it sits at lethal potential.
The shell doesn't sink RF, it reflects it. GFCI outlets (required in many areas for kitchen outlets) trip at 5mA differential between hot and neutral. No appliance is designed to sink current into ground unless there's an electrical fault.
> Then there is beryllium oxide in the thing...
Beryllium oxide hasn't been used in microwaves for a long time, and it presents zero risk unless the magnetron is smashed.
Recommended reading for you:
https://www.dannyguo.com/blog/my-seatbelt-rule-for-judgment/
https://www.psychologytoday.com/us/basics/dunning-kruger-eff...
Edit: They can interfere with WiFi because a microwave could leak a tenth of a percent of its nameplate power and it would overpower your access point by anywhere from 1x to 10x. Access points can, on certain bands, have radios up to ~1W, but 125-250mW is much more common.
It would also be completely harmless even if you were standing inches away from whatever the source of the leak was. Microwave RF energy only becomes dangerous when it is strong enough to heat up parts of your body that cannot cool themselves quickly due to having little/no bloodflow, like your eyes.
You could put a parabolic antenna on your home wifi AP and standing in that beam would expose you to more RF energy than your microwave.
I don't know why HN suddenly has a "DANGERS OF MICROWAVE OVENS!" boner this week...this is I think at least the second article on the subject of the 'dangers' of microwave ovens.
Regarding "the door gap is a long line" - that would be relevant if the beam were aimed parallel (or close to parallel) with the gap...
Am I wrong in my ignorant understanding that if some radiation leaked, its effect would largely be about heating affected body parts, which presumably you'd feel as, you know, heat; as opposed to some magical or terrifying "thing"?
In other words, why is microwave leakage worse than, e.g., my oven leaking heat? Is a potential small amount of microwave radiation in some specific way worse than feeling the heat when you open the oven after baking? I know radiation is a scary term, but what is the real, actual, scientifically documented risk here?
I know some pregnant families paid more money for microwave tests than their microwave cost, and it never felt legit, but I could be wrong.
IR radiation is stopped very easily, eg by dead skin cells. Higher energy radiation, eg microwaves, can penetrate much deeper into your body and start causing damage to living tissues.
On the contrary, IR penetrates past the dermis. Also, IR is much much higher energy than microwaves although both are non-ionizing radiation and will only result in heating. The amount of heating you get from the sun spread over a square meter is close enough to the power output of a microwave. Even with the lower attenuation to 2.4GHz vs 240THz the peak heating is still occurring on the surface. You'll probably notice if you're being heated with a couple hundred watts of power. If you don't notice then you're absorbing so little power that there's no chance of any injury. You'd need to raise the temperature of your organs by at least 10 degrees to cause any organ damage and your skin temperature would be so high by that point that you'd already have burns and any sane person will instinctively move away from the heat source causing them pain.
It would take far too long to heat up hot enough to cause more than skin burns and nobody would ever stand there long enough to even get a skin burn to begin with. Any sink presents a much more serious risk of injury than a broken microwave. A hair dryer is more dangerous.
It seems like you might be extrapolating the speed of heating some piece of food and assuming it could possibly heat someone outside of the microwave at the same rate. It can't. The only reason small things heat up quickly is because in a closed microwave the walls reflect the microwave energy many many times before it's eventually absorbed by the food. It's a high Q factor resonant chamber. Effectively when there's little energy being absorbed or escaping the intensity of the microwave radiation is multiplied many times over until the energy being absorbed is equal to the energy being put in. If the door is removed it's just going to bounce out and you get none of the massive jump in intensity that you get with the door closed and only a small 1/2 lb of food inside.
I'd suggest looking at the section on Adults and Microwave ovens. Over many cases of people being exposed to an open and operating microwave oven the only injuries were burns close to the surface and peripheral neuropathy from placing their hands inside the oven getting a substantial portion of the total power on a small area. The more serious injuries on that page were a result of much more powerful and more intense microwave sources at different frequency bands. If you stick your head into a high powered waveguide, you're going to have a bad time.
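To put a very rough upper bound on the intensity argument above (my own arithmetic, assuming the full ~700 W somehow radiated isotropically from the leak, which a real door gap won't do):

```typescript
// Idealized upper bound: a 700 W source radiating isotropically, compared with sunlight (~1000 W/m^2).
// A real door leak is directional and far weaker; this is only meant as an intuition check.
const intensity = (watts: number, metres: number) => watts / (4 * Math.PI * metres * metres);

console.log(intensity(700, 0.25).toFixed(0)); // ~891 W/m^2 at 25 cm, roughly full-sunlight territory
console.log(intensity(700, 1.0).toFixed(0));  // ~56 W/m^2 at 1 m
```

Even in that unrealistic worst case, at arm's length you'd be at roughly full-sunlight intensity; an oven meeting the leakage limit quoted elsewhere in this thread (5 mW/cm², i.e. 50 W/m²) sits well below that.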
In fairness I was surprised to learn that nerve cells absorb a fair bit more than skin and fat leading to quicker nerve damage. All things considered though, I stand by my statements, hot water at a sink presents a much bigger risk given the frequency of exposure and the risk of burns.
> The same reason the metal grid works is why there doesn't need to be a perfect door seal. As long as the gap is smaller than the wavelength of the microwaves, it's fine.
The issue is the gap in a door normally forms a long line.
The fact they interfere with WiFi should make it obvious the average microwave's Faraday cage is far from perfect.
The point of a microwave having a Faraday cage is not to prevent interference with WiFi; the cage is there to prevent the microwave from microwaving the user. At an average power of 1000W, even a thousand-fold attenuation (-30dB) means 1 watt (30dBm) leaks out: minuscule for humans but enough to saturate typical WiFi receivers.
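A quick sketch of the dBm arithmetic behind that (assuming the 1000 W figure above):

```typescript
// The dBm arithmetic behind the "1 W still leaks out" claim (assumes the 1000 W figure above).
const toDbm = (watts: number) => 10 * Math.log10(watts * 1000);
const fromDbm = (dbm: number) => Math.pow(10, dbm / 10) / 1000;

const magnetron = toDbm(1000);        // 1 kW = 60 dBm
const leaked = magnetron - 30;        // -30 dB of shielding leaves 30 dBm
console.log(fromDbm(leaked));         // = 1 W, versus access points that transmit ~0.1-1 W
```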
The cage was originally there to make them more efficient; spraying microwaves into the room would mean your food takes a lot longer to cook.
Anyway, I don’t recall all the details but 1W from a microwave is probably an underestimate.
WiFi uses multiple separate frequencies, some outside of what a microwave should be producing. So it takes significantly more energy to block it than you might think, especially when the hub is closer to the device than the microwave is.
PS: To be clear this is still a trivial amount of energy, just annoying when reheating food blocks WiFi.
> A Federal standard (21 CFR 1030.10) limits the amount of microwaves that can leak from an oven throughout its lifetime to 5 milliwatts (mW) of microwave radiation per square centimeter at approximately 2 inches from the oven surface.
So you were off by a factor of 200, and you can see that my referencing -40dB of attenuation was an underestimate. I do wonder what details it was that you thought you might recall.
Quite disappointed by HN standards of self-moderation yet again.
You’re way off, that’s 5 milliwatts per square cm times the surface area.
A 1-foot cube has 6 faces of ~30 cm * 30 cm, or 5,400 cm². 5,400 cm² * 5 mW = 27 watts. Of course 2 inches from the surface is a significantly larger box.
Of course that’s a legal maximum, most devices should be well below it.
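For what it's worth, a quick sketch reproducing that arithmetic (note the big assumption: every square centimetre of the surface leaking at the legal limit simultaneously):

```typescript
// Reproducing the arithmetic above: the 21 CFR 1030.10 limit applied over the whole surface.
// Note the big assumption: every square centimetre leaking at the limit simultaneously.
const faceCm = 30;                                // ~1-foot cube, ~30 cm per side
const surfaceCm2 = 6 * faceCm * faceCm;           // 5,400 cm^2
const limitMwPerCm2 = 5;                          // 5 mW/cm^2 at ~2 inches from the surface
console.log((surfaceCm2 * limitMwPerCm2) / 1000); // 27 W total in that worst case
```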
You can melt stuff with 27W. Doesn't pass the smell test.
Sunlight is ~93 W per square foot; this would be under 4.5 W per square foot. Good luck melting something at room temperature with that. I think you have a poor intuition about the difference between point sources and energy spread across an area.
Anyway, math is math, but presumably this is why that's the legal limit. It's low enough to be safe while high enough not to be expensive for manufacturers.
You would need to take your access point and wave it all around various directions in the space around the microwave. The “leak” could occur in a direction that doesn’t have significant signal. Might be a better test to cook a large bowl of water, while testing your phone (outside the oven) on 2.4ghz … holding it on all sides of the oven to see if any areas degrade the signal. This testing approach isn’t that conclusive.
>This testing approach isn’t that conclusive.
Agreed. I just tested with two phones and one phone timed out but the other was able to maintain a connection. That would suggest that my microwave is maybe leaking. However, I'm able to use the microwave without any noticeable effects on 2.4GHz devices.
A live test would be quite dangerous if the microwave does leak EM.
Neat idea!
I went ahead and tried it with a 5GHz connection (the site was practically begging me to) and it turns out my microwave blocks both 2.4 and 5 GHz signals. Pretty cool! If you're wondering, it's a decade-old Sunbeam that I bought for $30, so nothing special.
Brilliant. The best part of this was running into the kitchen while my girlfriend was washing dishes and watching her face as I put my phone in the microwave ;)
Or, just use an Ubertooth sniffer. From anywhere near a running microwave you should see it in the 2.4GHz range. It's cool to see how some microwaves vary the frequency up and down during a cycle. It looks like a standing wave bouncing back and forth. This also explains the common office phenomenon of the WiFi dropping every time someone nukes something.
https://youtu.be/DCYrrNQc3lM https://youtu.be/6N3P842Nay8
Whether your phone can connect from inside is not a great standard. Your phone's antenna is maybe 1/100,000 the power of a microwave oven magnetron.
Ignoring impedance and assuming the microwave oven is rated at 1kW: 1 kilowatt = 60dBm. A typical WiFi receiver works fine at -60dBm = 1 nanowatt. 60-(-60) = 120dB, or 12 orders of magnitude. To put that into perspective, a WiFi receiver works with 1/1,000,000,000,000 the power of a 1kW microwave oven.
I needed to test behavior of an app on an actual iphone with flaky cellular and thought a microwave would be a great place to simulate this. It wasn't. The cellular connection was unaffected. Wrapping the phone in aluminum foil killed the signal enough though.
With your phone outside, graph RSSI and noise. Turn on the microwave and you will see it drop.
With the phone on the inside and the microwave on, the RSSI will drop quite rapidly as well.
My first thought was that it would try something like this. For a given sensitivity it's much easier to detect the 700W signal than the 2W one.
> 1. Put phone on 2.4ghz wifi (5GHZ WILL NOT WORK!)
Might be nice to expand on "will not work". Wouldn't 5 GHz Wi-Fi failing to connect show that it's even better at blocking, and would easily block 2.45 GHz too? And I'd think that they should block 5 GHz too, since those meshes look quite fine, and they probably try to be extra-safe.
5GHz actually has worse penetration and stronger attenuation over distance than 2.4GHz. Successfully blocking 5GHz does not imply the same for 2.4GHz.
However I do agree that it's probably still gonna work, because the Faraday cages on microwaves are always overkill (even the cheap ones).
> 5GHz actually has worse penetration and stronger attenuation over distance than 2.4GHz. Successfully blocking 5GHz does not imply the same for 2.4GHz.
In this case it could be the opposite: Faraday cages only block wavelengths that are longer than the cutoff they're designed for (presumably 2.4GHz). Therefore it could be blocking 2.4GHz but letting 5GHz waves through, because their wavelength is too short to be contained.
As long as the hole size in the faraday cage is much smaller than the wavelength, it will work. The front mesh of a microwave typically has openings on the order of millimeters, which is still good enough for 5GHz.
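Quick sanity check on those numbers (the ~2 mm hole size is my assumption for a typical door mesh, not a measurement):

```typescript
// Free-space wavelength versus a ~2 mm mesh opening (the hole size is my assumption, not a measurement).
const c = 299_792_458;                            // speed of light, m/s
const wavelengthMm = (ghz: number) => (c / (ghz * 1e9)) * 1000;

console.log(wavelengthMm(2.45).toFixed(0));       // ~122 mm at 2.45 GHz
console.log(wavelengthMm(5.5).toFixed(0));        // ~55 mm at 5.5 GHz
// Both wavelengths are tens of times larger than a 2 mm hole, so the mesh attenuates both bands heavily.
```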
A cheap consumer RF meter detects "high" levels of RF from 5+ feet away, at least on the 4 or 5 random units I've tried. So the Faraday cage doesn't seem to be doing its job in full.
A "high" from a random RF meter doesn't mean anything. Give me numbers in dBm/MHz.
RF leaks can be pretty directional. So simple tests like this phone test or waving a power meter around aren't terribly sensitive. (But pay attention if you get a test failure!)
There really isn't any substitute for a proper EMC chamber.
My Bluetooth speaker always cuts out when the nuker is on, but I never notice any issues with WiFi. Wonder why.
Because Bluetooth uses 2.4 GHz, and your wifi is on 5GHz.
Is this an IQ test?
Yes. If you are able to follow instructions you will not turn the oven on with the phone inside.
Unless it's a recent-generation iPhone with wireless rapid charging.
It says in red at the top to not turn on your microwave. This seems to be testing if WiFi waves can get through the microwave.
It’s interesting the browser can get access to this information with no prompts
All it does is measure ping. No direct WiFi measurements, just the timing of regular network requests.
> It’s interesting the browser can get access to this information with no prompts
Exactly what information is it getting? What Chrome API is this using?
This thing is totally insane. The instructions are not clear at all and the risk that someone bakes a phone in a MWO or does some other harmful thing is rather high. Please remove this post.
I strongly disagree. If you are idiot enough to microwave your own phone, that's on you. This post is fine, and interesting.
Calling people idiots for making mistakes is about the most boring thing I can imagine. I agree it shouldn’t be deleted, and that it’s interesting. But everyone flubs it from time to time, even people who like to think they’re above that.
There is a difference between making an understandable mistake and doing something that should make you ashamed for being that dumb.
Nuking your phone in the microwave is in the same category as baking it in the oven. You really shouldn't be surprised when you don't have a working phone anymore.
I don't think it is wrong of us to expect better from a bunch of adults. They should know better.
Look. Calling people idiots or stupid is just a shitty thing to do. Shaming people for not having the same information, recall, thought process as you is never going to help them better meet your expectations of their mental faculties. It might even have the opposite impact to some degree, depending on how they react to the insult.
It doesn’t matter one bit whether you think someone should think it through the way you would. People routinely disappoint on that front, and it’s almost always because people have expectations that are closer to their own experience and thought process than they realize.
Personal anecdote, for your judgy amusement: a couple weeks ago I fell hard on the pavement of a very busy arterial road. I was running behind schedule, hurriedly looking for a chance to beat the crosswalk signals if there was a safe gap in traffic. I was paying attention to the height of the curb as I went, because it’s unusually high off the road at one intersection then gradually gets closer to a normal height in the course of a block. Trouble is, I wasn’t paying close enough attention, and I misjudged the height by about a foot.
I stepped into the road in a gap in traffic, but there wasn't anything under my foot when I stepped down. Before I knew it, I was sprawled out on the street, thankfully with a busted-up hand and not a busted-up head. And more thankfully, with an audience sitting at the next red light, not still traveling towards where I fell.
I was, indeed, embarrassed by all of this. And any one of the people who saw it happen surely could have called me stupid. Maybe they even did, but I’m happy they kept it to themselves at the time. I realized my error as soon as it happened. What good would have come of insulting me on top of that?
Honestly, I don't think your anecdote helps the point you are making. Yes, we all do stupid things from time to time. I have done stupid things. I'm arguing that there are times when we should show kindness and compassion, and there are times when, fuck it, you were an idiot, you should know better, and I am going to judge you for it.
There is an idea floating around that all judging is bad. I reject that. It might not be helpful to the person being judged an idiot, but the expectation that I should change my view of others to help them with their learning is... I'm struggling to find the right words here... an imposition?
I think it is the difference between being kind to everyone and not being unkind to anyone.
Everyone judges people all the time, to claim otherwise is a lie. I will continue to judge people by their actions, whether or not they improve by some metric is not my aim, nor my problem.
Agree that the warning:
> note: do not turn your microwave on for ANY portion of this test
might not be visible or clear enough. I think OP should consider updating the instructions and setting this as the first step:
1. Unplug the microwave.
> Unplug the microwave.
Pretty sure you need the earth though right?
I keep hearing that for Faraday cages but it's not clear to me why.
Why should it be grounded?
The cell signal in the Faraday cage causes a mirror signal on the cage's inner surface, and the cage's conductivity ensures that the potential is zero elsewhere in the cage.
Why do I need the ground? Is it a consequence of not having a perfect conductor?
The case itself becomes an antenna if it's not and just re-resonates the signal.
Does it though? At what frequency? In the limit of f=DC this is trivially not true.
Also, if that were the case, an airplane couldn't be made into a Faraday cage.
Also, the cage+Earth system would itself be an antenna.
If you need ground for a topologically imperfect cage, maybe… but I doubt it.
I think (but I don't know!) this is like the legends about engine torque that car guys believe: electrical engineers putting everything to ground because they got burned.
So hard not to automatically press start.
But one could also use a fridge for this test.
I don't know, I'm not sure the fridge would give a good indication of the microwave's leakiness.
It might, if someone put their router in the microwave, simultaneously with the phone in the fridge... but at best it would prove that both the microwave and the fridge have leakiness.
Neither does TCP over WiFi, really.
In most cases, the fridge won't fit.
You need a basic level of competence to be allowed in the kitchen; that is where most domestic accidents happen.
Funny how kitchen stuff was thought to be women's work once.
>Funny how kitchen stuff was thought to be women's work once.
Clarification: domestic kitchen stuff was thought to be women's work, professional kitchen stuff was thought to be men's work (and still is[1]).
That article doesn't seem to talk about safety at all.
I have read this several times over and have no idea how you could mean it which isn’t sexist af.
Edit: okay maybe I can imagine you’re saying that women have been competent in the kitchen and men [/other gender identifying people?] have demonstrated not being as competent. I’m really trying to stretch credulity to read this in good faith.
Women are widely known to be more attentive and dexterous in e.g. electronics assembly. Surely this translates to the domestic kitchen as well.
Sexist? Please, shove that word. These are simple facts about life.
> Sexist? Please, shove that word.
Respectfully, fuck no.
Too much call of the void in this tool. The phone is in the microwave. I could push start. All I need to do is push start. Just...zap. Poof. Sizzle.
Related question... I have a Bluetooth headset (Aftershokz Aeropex); when my microwave is running, the audio from my phone seems to get interrupted. I think it also happens if just the phone is near the microwave. Should I be concerned?
It's typical.
What does this chart indicate?
You load the web page on your phone, put it into the microwave and close the door. The microwave should be a Faraday cage, preventing microwave radiation from getting through. Now the phone/web page cannot contact the internet. The chart stops updating.
Unless there's a leak, then the chart continues updating while the phone is in the microwave.
Then open the door and look if it could/couldn't keep pinging the server while the door was closed.
I did this and now my phone has an extra 4gb of RAM
Mine has 5G now!
This doesn't answer GP's question. How does it work? Why does my workstation connected to my router via ethernet cable show wildly varying results? What is the unit of Y axis?
Seconds (judging by the source code), and I assume the variance comes from the time it takes the server to respond + javascript delays. It's interesting to see the significant variation and an almost 100 ms difference between what my browser reports and what the chart reports. (The server responses were pretty consistent).
I think it tests the latency of the WiFi connection. If WiFi cannot leak into the oven, microwaves cannot leak out.
Alternatively, you could buy a microwave leak detector for $20-30.
So should I be concerned if the phone seemed to keep pinging just fine inside the Microwave? I also notice my Bluetooth headset break up if I’m near the microwave while it’s running.
Isn't it lovely how we're inundating our homes with radiation at wavelengths that can cook meat? It's as if we _want_ the robot revolution to succeed.
You mean like infra-red from the Sun when we have homes with windows?
(A microwave is ~1 kilowatt up close, WiFi is ~1 watt and meters away. This is like spreading fear that your house has a warm radiator, which is bad because ovens use warmth to cook food.)
Not only is there a massive difference in the power of a microwave oven compared to wifi, but the mechanism through which it heats food is dielectric heating [0] which does not occur in radio signals because they are not rotating fields.
This basic information is contained in the first very short paragraph of how a microwave oven works on wikipedia.
I hear these concerns in real life and, as per my longer post above, I don't understand the threat model. My oven and stove can also cook meat. It's what they do. What is the actual concern? I feel I have burned myself on ovens and stoves a lot more than I have on microwaves in my life (let alone the risks of popular gas stoves etc). I am genuinely curious: what is the delta, the differentiating factor, that makes microwaves mystically scary?
The only adverse effect microwaves have been found to have on humans is due to heating [1]. WiFi transmitters emit less than 1/1000 the power that a microwave oven does. Unless your cell phone starts to make you feel uncomfortably warm, I don't think you have to worry.
[1] https://en.wikipedia.org/wiki/Microwave#Effects_on_health