The amazing progress of LEDs
One interesting application for super-bright LEDs is home theatre projectors. Until very recently such projectors used lamp modules (usually metal-halide) that typically have an operating lifespan of a few thousand hours at most, and that's a rough estimate: failures can happen sooner, and lamps occasionally implode. Brightness also usually drops off gradually as lamps age. New lamp modules usually cost a few hundred dollars. These lamps produce a large amount of heat and require active cooling, which makes projectors noisy unless carefully designed for quiet operation. For these reasons, home theatre projectors probably remain more of a niche product than they might otherwise be.
LED-based lamps are starting to show up in this market sector. Current LED-based projectors are mostly portable models that offer low brightness and poor image quality, but some home theatre models of decent quality are starting to appear. At present they're expensive, less bright than most projectors based on traditional lamps, and still require fans for active cooling. However, as LEDs become more power-efficient and economical, these projectors will hopefully become brighter, passively cooled, and significantly cheaper.
Projectors are not appropriate in many environments, especially those with high ambient light levels, but LEDs may help them make major inroads into the big-screen market.
Another interesting application: growing vegetables very efficiently (potentially in urban areas).
"Plants mainly need blue and red light for photosynthesis and far-red, a colour not even visible to the human eye but visible to the plant….."
This makes LEDs very interesting for this purpose, since the ideal spectrum can be matched exactly. It will make more and more sense over time as the world urbanises further and LEDs get ever cheaper.
http://www.bigpictureagriculture.com/2011/02/plantlab-nether...
I can think of a large market for lighting suitable for growing vegetables in your home that doesn't produce a lot of waste heat...
It's already been done: LEDs for growing weed make it impossible for cops to spot growhouses using FLIR helicopters, and the much lower power usage means spikes in energy consumption are far less dramatic.
I think in terms of impact, growing vegetables in a sustainable manner that uses little water will be critical for the long-term sustainability of cities where water is scarce, such as Dubai and others in desert regions.
Some people in that market are already seeing harvests of 1 gram per watt with LEDs, and the quality is as good as that from a much hotter, higher-wattage high-pressure sodium bulb.
For more stationary projectors, couldn't you just make a larger LED array and focus the light through a lens, rather than try to cram super-bright, and presumably expensive, LEDs into a regular bulb-sized package?
Short answer: No.
Long answer: Typical projectors have two lenses. The one at the front is obvious, since it focusses the image of the LCD/film onto the wall. Just behind the LCD/film is a collimating lens (usually a Fresnel lens), which ensures that all the light produced by the lamp is focussed so that it passes through the first lens. Something like this:

                 |Collimating lens
                 ||LCD
    Lamp      .-||-.      |Main lens
    |      .-'  ||  '-.   |    .-'
    |   .-'     ||     '-.|  .-'
    O-----------||--------|------
        '-.     ||     .-'|'-.
           '-.  || .-'       '-.
              '-||-'

The light brightness over the plane of the LCD/film must be very even, and this is achieved by having the lamp situated a fair distance behind it. The lamp itself must therefore be small in order for its focussed rays to fit through the aperture of the main lens. In fact, the lamp must usually be no larger than the aperture of the main lens.

It is conceivable that you could arrange an array of LEDs behind the LCD, but you would have to arrange that all their rays are focussed through the main lens. This prevents us from using a large array of LEDs in the same arrangement as above, as the light from the LEDs at the edge of the array would not pass through the main lens. An alternative would be to place the LEDs closer to the LCD, and give each LED a lens sufficient to focus its rays through the main lens. Something like this:

           |LCD
    LEDs   |        |Main lens with
    O.     |-.      |     lenses
      '-.  |  '-.   |  .-'
           |     '-.|.-'
    O------|--------|------
           |     .-'|'-.
      .-'  |  .-'   |  '-.
    O'     |-'      |

In this case, each section of the LCD would be lit by a different LED, so you would have to be very careful to keep the lighting brightness and colour even, since LEDs naturally vary in colour and brightness.

Ah, well that's a shame then. In retrospect, if I thought of it, the people whose actual job this is will almost certainly also have thought of it. But thanks for the excellent info and lovely ASCII graphics. :)
Related: "Drowning in Light" from a few weeks ago: http://nautil.us/issue/11/light/drowning-in-light https://news.ycombinator.com/item?id=8344769
It makes the compelling argument that people are addicted to light. "Tsao calculates that, as a result, light represents a constant fraction of per capita gross domestic product (GDP) over time; the world has been spending 0.72 percent of its GDP for light for 300 years now. If there are other energy markets that show a constant percentage of GDP expenditure over time, Tsao doesn’t know of them."
Certain things that are expensive in the 3rd world are cheap in the rich world. Sewage, electricity, and water are so cheap in the 1st world that we don't think about the cost.
The first time I saw 100% adoption of compact fluorescents was in Cambodia. They pay $0.40/kWh. That is insanely expensive (unless you live in Germany).
Lighting technology disproportionately benefits the poor rather than the rich. Anyone who works on it is my hero. My family bought power for $0.04/kWh so I could study at night.
While Bangalore is not representative of India as a whole, the electricity prices are similar to global figures.
I'm genuinely curious whether new homes will start wiring lighting for DC to accommodate all these new LED bulbs. I just don't have a good sense of whether it's more efficient to transform AC to DC at a central hub in the home and distribute to all rooms, or to do it at the bulb itself.
The Emerge Alliance[1] is an industry group that has been working on this issue, though they've started with a spec for datacenters (as stephen_g alludes to). I think the consensus is that it is definitely more efficient to convert to DC at a single point, but one located as physically close to the loads as is practical, because of the line losses mentioned. Definitely something that will make sense for commercial buildings/campuses and multi-unit residential, and single family homes are not too far behind, I think. I can picture homes sometime soon just being wired for AC in the kitchen/garage/utility rooms with a LV DC grid elsewhere. The lower safety requirements for low-voltage really open up possibilities for system design and integration in the built environment.
Line losses through the cables are usually a killer with really low voltage DC (like 5V or 12V), so I think that AC would be more efficient there.
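To put rough numbers on that, here's a sketch with assumed values (a 100 W load fed over a 10 m run of 1.5 mm² copper, about 0.23 Ω out and back; none of these figures come from the thread):

    // I^2*R loss in the cable for the same 100 W load at different supply voltages.
    #include <cstdio>

    int main() {
        const double r_loop = 2 * 10 * 0.0115;  // 10 m of 1.5 mm^2 copper, out and back, ohms
        const double load_watts = 100.0;
        const double supply[] = {5.0, 12.0, 48.0, 230.0};
        for (double v : supply) {
            double amps = load_watts / v;        // current the cable must carry
            double loss = amps * amps * r_loop;  // power dissipated in the cable itself
            std::printf("%5.0f V: %5.1f A, %6.2f W lost in the cable\n", v, amps, loss);
        }
        return 0;
    }

At 5V the cable wastes almost as much power as the load draws; at 48V the same run loses about a watt, which is why the datacentre approach below can work.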
But some datacentres are using 48V DC now and then just piping that through switching regulators in the servers, instead of going from AC to DC in each machine. That could be workable.
48V DC has been in multi-client data centres (Telehouse, TeleCity etc) for years. It's a standard voltage in telecoms kit. The data centre I was involved in building out back in 1999 got this as standard.
Interestingly, I'm looking to do exactly this in a new home (new to me, not newly constructed).
I'll soon be moving to a park home (a relatively permanent mobile home) that needs a fair amount of renovation.
I'll be running all lighting, computers and the TV off 12V DC circuits powered by a bank of vehicle batteries, with the batteries charged from 12V solar panels.
Converting voltages is trivial with AC; how do you handle devices that require something other than 12V (lots of phones want 5V, lots of laptops want 18V) with a DC system?
I'll be fitting some 12V USB wall sockets, much like the type you can fit in a car. These can then provide power for phones.
Regarding computers, I've so far only considered my desktop. You can get ready-made DC-to-DC ATX power supplies quite easily [1].
Have you considered cabling losses? There is a reason power lines use very high voltages.
I've considered voltage drop over cable length only insofar as I'm aware that it occurs. I haven't tested the level of voltage drop.
I currently use 12V circuits in my camper van and haven't experienced any significant drop over cables about 3m in length. LED-based lighting and a 12V fridge operate as expected, and the voltage at the point where the power reaches these devices is very close to the voltage across the battery supplying the current.
I previously used 12V circuits in a narrowboat I owned and didn't experience any drop over cables about 20m in length. That boat used halogen lighting, and again the voltage at a lamp was very close to the voltage across the battery.
Just to be sure: you are aware that you have to measure the voltage drop under load? Even with rather thick wires, 20m should give you some noticeable voltage drop with any non-minimal load.
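For a rough sense of scale (illustrative numbers, not measurements from either boat): 20m of 2.5 mm² copper is about 0.27 Ω out and back, so a 5A load (60W at 12V) would drop roughly 1.3V in the cable, more than 10% of the supply, while a 1.5A halogen lamp would drop under half a volt. Light loads can easily hide the problem.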
What sort of plugs and connectors are you going to use for this?
Could you be a little more specific?
Sure. For example, I imagine you'll have some kind of 12V receptacle in the wall where you could plug in a computer or a lamp. Is there a standard for this type of thing?
One problem with current popular LED bulbs:
They emit more blue light than incandescent bulbs [1]. If people light their households with LED lights at night, it might shift their circadian rhythms [2]. Screwed up circadian rhythms can have all sorts of negative health/productivity effects.
[1] http://www.designingwithleds.com/measuring-light-quality-phi... [2] https://justgetflux.com/research.html
I'll be trying my hand at some LED strip / Arduino programming to build a dawn alarm that goes from dim red to bright white progressively. I have a Philips halogen dawn alarm which naturally goes from orange -> red as it ramps up, but unfortunately doesn't get quite as bright as I would like for that "wake up at the cabin" experience.
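For anyone curious, here's a minimal sketch of that ramp using the FastLED library with a WS2811-style 12V RGB strip; the pin, strip length, 30-minute duration, and linear blend are all assumptions on my part, not details of the actual build:

    #include <FastLED.h>

    #define DATA_PIN 6                                   // assumed data pin
    #define NUM_LEDS 300                                 // assumed strip length
    const unsigned long RAMP_MS = 30UL * 60UL * 1000UL;  // 30-minute sunrise

    CRGB leds[NUM_LEDS];

    void setup() {
      // Colour order (GRB here) varies between strips.
      FastLED.addLeds<WS2811, DATA_PIN, GRB>(leds, NUM_LEDS);
    }

    void loop() {
      unsigned long t = millis();
      if (t > RAMP_MS) t = RAMP_MS;                    // hold full "daylight" once risen
      uint8_t phase = (uint8_t)(255UL * t / RAMP_MS);  // 0 = deep night, 255 = full sun

      // Colour drifts from deep red toward white as the ramp progresses...
      CRGB colour = blend(CRGB(255, 0, 0), CRGB(255, 255, 255), phase);
      fill_solid(leds, NUM_LEDS, colour);

      // ...while overall brightness climbs from barely-on to maximum.
      FastLED.setBrightness(phase > 0 ? phase : 1);
      FastLED.show();
      delay(1000);                                     // one update per second is plenty
    }

A real version would want a gamma-corrected ramp and a real-time clock for the alarm time, but the skeleton really is about this simple.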
I'd love to see more applications of LEDs providing the right color temperature and intensity for the time of day, as well as more applications that avoid the bulb form factor. I'd love for my ceiling to emit light like the sky...
The best thing I've found so far are these lights meant for growing coral:
http://www.maxspect.com/index.php?option=com_content&view=ar...
One of the models puts out UV, so be careful.
Message me (email in profile) if you want to chat!
Those look like a very polished product! In my case, I'm looking to take advantage of 300-LED RGB strips being available for $15, as well as the challenge of writing my own control logic with an Arduino. The SAD wake-up alarm is more of a side benefit.
I don't have any SAD issues, but I find that getting proper light in the morning helps stop me from going to bed too late.
I like your project! How many watts is the LED strip?
At full intensity it draws 6A at 12V, so 72 watts if my mental arithmetic is correct. With luck that will throw enough light to be useful.
I'd been considering a project like this recently. I'd like some way of bringing the amount, timing, and frequency of light I experience indoors closer to what I would experience outdoors, under the assumption that it would be better for my sleep/wake cycle and possibly other things.
That thought occurred to me as well. Circadian phase delay already seems to be a problem re: the increasing brightness of our computing devices, almost all using LEDs.
New phones, tablets, TVs, etc. should come with a warning to keep them dimmed down in the evening, or better, to keep use to a minimum late in the day. Sure, I know, the odds are about zilch that people would actually heed such advice, but it ought to be out there anyway.
Another thing is the relatively low CRI. For many purposes (industrial, medical, artistic) the spectral output of bright LEDs is far less than ideal. LEDs will probably get closer to "full-spectrum" over time, and no doubt will easily beat the discontinuous spectrum of fluorescent lamps.
LEDs now just need to get cheap enough, and really need to work with existing dimmer controllers.
Yes, but there are solutions for brightness/color at least for our smartphones and PCs.
Android: https://play.google.com/store/apps/details?id=com.vito.lux&h...
CRI (or at least what's advertised) has really improved in the last year. It's not difficult to find 90+ CRI bulbs at places like Home Depot.
Almost all the LEDs in my house are ~2700K, about the same as incandescent lights.
Regarding circadian rhythm, light from near-sleep-time television watching is much more of a concern, but that's been a problem for at least 30 years.
2700K can still have a spike of blue in the wrong spot. This is a 2700K bulb: http://www.designingwithleds.com/wp-content/uploads/2014/01/...
What is the relevance of that blue spike? The incandescent spectrum looks much more linear (http://housecraft.ca/wp-content/uploads/2012/09/spectral_res...), but, just judging by eye, appears to emit more aggregate blue energy than the warm white LED.
Then again intensity towards blue-green still seems much higher in the LED, and maybe that matters more?
I haven't managed to find an LED warm/soft bulb that actually has a spectrum like in the chart you linked.
The spectrum I linked above is from http://www.designingwithleds.com/measuring-light-quality-phi... -- it's for a Cree Soft White bulb. There's a bunch more area under the curve over the range that seems to matter for that bulb than for the incandescent one (although incandescent bulbs might not be great at night either...)
Green light does alter circadian rhythms as well, although not as strongly as blue (peak responsivity seems to be ~460-480nm). The actual circadian-wavelength-responsivity curve doesn't seem to be well mapped out, as far as I can tell.
I would suppose that intensity is as important as, or more important than, exact wavelength (within the violet-to-green range), given that the increasing intensity of the rising sun is the origin of our circadian-rhythm adaptation.
I'd pay good money for a TV with integrated F.lux (i.e. blue gradually degrades with the movement of the sun).
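Even without vendor support, the logic such a TV would need is tiny. A toy sketch with a hard-coded sunset hour standing in for real solar position (every value here is an illustrative assumption):

    #include <algorithm>
    #include <cstdio>

    // Multiplier (0.0-1.0) for the display's blue channel, fading over
    // the two hours after an assumed 7pm sunset.
    double blueGain(double hourOfDay) {
        const double sunset = 19.0;  // a real set would compute this from location and date
        const double fadeHours = 2.0;
        double t = std::clamp((hourOfDay - sunset) / fadeHours, 0.0, 1.0);
        return 1.0 - 0.6 * t;        // keep ~40% blue so the picture stays watchable
    }

    int main() {
        for (int h = 17; h <= 23; ++h)
            std::printf("%02d:00  blue gain %.2f\n", h, blueGain(h));
        return 0;
    }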
I started using F.lux more years ago than I can count, and honestly it really seemed to help. However, getting similar functionality on all of your other devices (e.g. tablets, phones, TVs) is near impossible right now.
As far as I know the "blue light ruins sleep" research is fairly decent, so hopefully it isn't a placebo.
For Android I use something called "Twilight". It is a good stand in for F.lux, and even has a few extras. For time control, it has the standard sunrise/sunset coordinate system that F.lux uses, as well as Alarm control to set custom bedtime/wakeup.
Interesting link. The contrarian in me says it only looks at a couple of bulbs, and in my (limited) experience with a variety of new low-end LED bulbs the light quality varies greatly and has also been improving, so maybe this won't end up being much of an issue for circadian rhythms at all. But it's an interesting factor I hadn't been considering. (How do LEDs compare in this respect to the CFLs that have already replaced incandescents for many, I wonder?)
Exponential growth is awesome! However, I was curious about the maximum theoretical lumens per watt, to determine when the growth must flatten out. A current LED bulb gives about 60 lumens per watt, while an ideal monochromatic light source could give 683 [1]. So it looks like there is _only_ one order of magnitude left in efficiency gains.
That 683 lumens per watt number is, as you say, for monochromatic light, which is not useful for humans to see by [1]; the limit for light suitable for human vision is about 300 lumens per watt [2]. That's an upper bound even ignoring losses, and some LEDs are around 70 lm/W now, so the remaining efficiency gains top out around 4x in theory and are probably closer to 3x in practice.
[1] http://www.cool.conservation-us.org/byorg/us-doe/color_quali...
He makes some dubious efficiency claims. It's true that coloured LEDs have always been pretty efficient, but that's no use for lighting your home. White LEDs have surpassed incandescent bulbs but still have a way to go to reach fluorescent tubes and the granddaddy of efficiency, gas-discharge lamps.
One reason (the reason?) for low power levels is that they generate so much heat and are difficult to cool. LEDs are still primarily heaters, at least useful in the winter.
I think you need to update your figures. High-end consumer luminaires have been coming in above 100 lm/watt total system efficiency for at least a year or so now, and with CRI superior to fluorescents.
Great news. I guess efficiency is following a similar rapid growth to power.
Several years ago I designed a 1,500 W LED-based light source using extremely tightly packed LEDs. Thermal management was a huge challenge: it took over three months of constantly running FEA thermal tests, as well as physical tests, to zero in on an innovative approach to cooling the array. Crazy project. The surface of the emitter was measured at over 60,000 candelas per square meter. You simply could not look at the thing directly; it was really dangerous, almost like working with lasers.
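For a sense of why the thermals dominate a design like that, here is the standard thermal-resistance arithmetic for a single emitter; every number below is a generic illustration, not a figure from that project:

    #include <cstdio>

    int main() {
        // T_junction = T_ambient + P * (theta_jc + theta_cs + theta_sa)
        const double t_ambient = 35.0;   // deg C inside a warm fixture
        const double power_w   = 10.0;   // heat dissipated by one emitter
        const double theta_jc  = 1.5;    // junction-to-case resistance, K/W
        const double theta_cs  = 0.3;    // thermal-interface material, K/W
        const double theta_sa  = 6.0;    // modest passive heatsink to air, K/W

        double t_junction = t_ambient + power_w * (theta_jc + theta_cs + theta_sa);
        std::printf("T_junction = %.0f C\n", t_junction);  // ~113 C: already marginal
        return 0;
    }

Pack 150 such emitters into one array and the shared heatsink's effective theta_sa has to fall enormously, which is where the FEA runs and the exotic cooling come in.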
I'm a bit disappointed they didn't include a legend on the giant chart they put in the middle of the page.
Why do I have to enable "whatsappsharing.com" in my NoScript filter to see the graph?
Because you're using a browser extension that breaks the web.