Oh right, about those LEDs (xiphmont.livejournal.com)
407 points by BuuQu9hu on Nov 30, 2016 | 187 comments


> "What do you mean? You can have any bin mentioned on the spec sheet."

Reminds me of the ancient story of the electronics manufacturer that sourced some resistors or whatever from Japan for the first time. The spec called for max 1% bad parts or whatever. When the parts arrived, the box contained a packet with a note that said something like "Thanks for your order. We are unsure why you want 1% defective parts, but for your convenience, we have packaged them separately."


"An IBM plant in Windsor, Ontario, is said to have ordered a shipment of components from a Japanese firm, specifying an acceptable quality level(AQL) of three defective components per 10,000 shipped. In a covering letter accompanying the shipment, the Japanese company apologized and said it had met with great difficulty producing these defective parts, and had been unable to understand why they were required. They wrote: “We Japanese have hard time understanding North American business practices, but the three defective parts per 10,000 have been included and are wrapped separately. Hope this pleases.”

This seems to be the oldest reference: Chris Taylor (1995), "The case for customer satisfaction", Managing Service Quality: An International Journal, Vol. 5, Iss. 1, pp. 11-14, http://www.emeraldinsight.com/doi/abs/10.1108/09604529510081... but it is behind a paywall.

It is referred to, for example, by http://link.springer.com/chapter/10.1007/978-981-287-429-0_3... "The Effectiveness of Service Quality by Jabatan Agama Islam Wilayah Persekutuan (JAWI) Towards Customer Satisfaction"
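For anyone curious how an AQL like "three defective components per 10,000" is actually enforced: incoming inspection typically pulls a sample of n parts from a lot and accepts the lot if at most c of them are defective, with the acceptance probability following the binomial distribution. A minimal sketch in Python (the sampling-plan numbers here are made up for illustration):

    from math import comb

    def p_accept(n: int, c: int, p: float) -> float:
        """Probability that a lot with true defect rate p passes a plan
        that inspects n sampled parts and accepts on at most c defectives."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

    # Hypothetical plan: sample 500 parts, accept only if none are defective.
    for rate in (0.0003, 0.003, 0.01):   # the 3-per-10,000 AQL vs. sloppier lots
        print(f"defect rate {rate:.2%}: P(accept) = {p_accept(500, 0, rate):.3f}")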


Funny, I heard it as AOL buying modems. Instead of a complaint, they got a shipment in two parts with a note: "We don't know why you wanted defective parts, but for your convenience, we packaged them separately."


This is exactly my experience. Cheap Chinese LEDs seem like a bargain until you discover they are not spec'd beyond "they emit light". The steep price you pay for CREE and other brand-name LEDs is for a product that actually lives up to its datasheet.


I have gotten some good Chinese filament LED bulbs from eBay, but it is very hit or miss. I did not analyze them for output quality beyond "it looks good to me". The total cost is still lower than CREE and other brands, even counting the half of them I had to throw away: one poorly manufactured batch had an audible electric whine, and the higher-output models failed after a month or two.

If you are willing to buy multiple times what you need to make up for the quality issues, you can get a decent deal with the cheap Chinese ones. Both the ones that whined and the ones that did not had a fabric covering the wires at the base of the bulbs. If I had some way of knowing which eBay merchants were selling the good ones, I would get a much better deal.

By the way, Nichia is a Japanese company.


Buy cheap, buy twice. Even if you get a good one, no-brand Chinese bulbs just don't last. They're badly engineered and use cast-off LED chips that nobody else wants. I made the mistake of buying some Chinese LED lamps for my utility rooms and outbuildings; they all died within 18 months.

Philips and Osram LED lamps just aren't that expensive. Cheaping out just isn't worth the hassle.


"Buy-twice" methods like this further convince me that a stronger incentive is needed to ensure such junk is not manufactured but is integrated into what many are calling a "circular economy". I wish it was the responsibility of the manufacturers to take back all failed and end-of-life products and recycle the materials such that their waste output was minimal. Likewise, the I wish the wholesaler and retailers were responsible to take back the product from the purchaser or consumer and send it back upstream (and not drop it in the dumpster like Costco does).


I wish there were some drop-dead easy way to do this, like just handing these items to the postman or the like: dead batteries, outdated electronics, broken electronics, etc.


They are around the 12-month mark now. I will know if they make the 18-month mark 6 months from now.


Buying a lot of crap and doing your own QA has some horrible negative externalities... This sort of throwaway economy is the last thing we need.


This is the opposite of the linked post's experience.

The post says the author received LEDs exceeding expectations on quality, at a low cost for a small production run.

They got the quality and more.

If interested, ask how.


Nichia is a Japanese company.


Trivia fact:

Nichia is not only a Japanese company, but they are also the inventors of the now-ubiquitous GaN blue LEDs; or at least Shuji Nakamura was working there when he invented them.


Quality vs Throughput right here.


I have had the experience of dealing with some Japanese companies when they tested the chips we developed. Their level of testing surpassed what most companies here in the US do when they qualify a part. They don't take your word on the specs; they do their own testing when possible.


Watch Gung Ho for a mostly-accurate comedy relating to this.


The article is about him calling someone at Nichia USA[1] to acquire some LEDs and complaining about all the cheap crap flooding ebay from China. The comment above is perfectly appropriate.

[1] Nichia is a Japanese company with US offices.


The company the author refers to in the article is not Chinese.


True of most of the Chinese 'manufacturers' (usually just encapsulators) I've bought from. However, there are Chinese manufacturers that hit spec reliably. It's just hard to sort the wheat from the chaff.


> It's just hard to sort the wheat from the chaff.

From the people I've talked to, it's impossible without being on the ground in the manufacturing center (and/or hiring a manufacturing fixer who is).


Also, if you order in large amounts, test constantly, because it's pretty simple for a supplier to send good stuff for just the first few shipments.


I've heard that's a problem also. The first run will be agreeable, then subsequent runs are continually "optimized" until someone gripes.


Yes, and those select few supply Foxconn, Compal and Quanta. What you get on the open market is the rejects from those big contracts.


I bought a red COB LED from AliExpress to go with some white ones for my plants. I got curious about the wavelength (in nm) of the red being used and dropped the seller a question about it. I loved the reply I got back...

Friend, they are red.

:D


The growth in efficiency has been spectacular. Every other light source will be retired in the face of LEDs' greater efficiency, solid-state design, and lower cost.

The only light more efficient than an LED is an LED that is turned off. Demand-based, network-controlled lighting is the next step.

Cree already sells Power-over-Ethernet powered and controlled light fixtures to do this: http://www2.cree.com/smartcast-landing-page


Now I want to build and sell a diamond nuclear battery-powered Japanese LED light orb, structured so that as one LED burns out a previously deactivated LED is activated, with a motion-sensor power switch. Stick it into a gnarled length of Australian Buloke wood and presto: a walking stick/staff of light that lasts hundreds of years, shakes on and off, and is so hard you can take a massive swing at an orc (or fellow SCA combatant) with spectacular results. Since I can't turn off the battery, I'll make it extract, liquefy and store hydrogen for a bonus fireball-shooting feature, but that won't last as long.


You can just fill the sphere with tritium gas and coat it with phosphorescent paint. Radioactive decay will emit electrons that impinge on the paint, generating light.

Or you can just buy a keychain or a torch made using this principle. It will provide continuous light for 10+ years: http://www.kitmonster.co.uk/product_info.php/products_id/638


Tritium has a half-life of only 12 years.

Which is fine for my Luminox watch, but not fine for a wizard's staff.

Also, the light is pretty weak. You can only see it in a dark environment. Again, that's fine for a watch.
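To put that half-life in numbers, a quick back-of-the-envelope decay calculation (plain Python, using tritium's ~12.3-year half-life):

    # Fraction of the tritium (and, roughly, of the light output)
    # remaining after t years, given tritium's ~12.3-year half-life.
    HALF_LIFE_YEARS = 12.3

    def fraction_remaining(t_years):
        return 0.5 ** (t_years / HALF_LIFE_YEARS)

    for years in (10, 25, 50, 100):
        print(f"after {years:3d} years: {fraction_remaining(years):.1%} of original brightness")

After a century the vial is down to well under 1% of its original brightness, which is the wizard's-staff problem in a nutshell.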


I admit my ignorance regarding wizard's staff specifications :|


Funny, but I've been in design meetings at least twice as silly as that.


It should be considered that LEDs are now being used for lighting in situations where there used to be no lighting at all [1]. The potential energy savings we could realise with LEDs may be negated by an increase in lighting itself. This would be an example of Jevons' paradox [2].

[1] http://www.lowtechmagazine.com/2008/10/led-light-cfl-b.html

[2] https://en.wikipedia.org/wiki/Jevons's_paradox


> LEDs are now being used for lighting in situations where there used to be no lighting at all

While LED lights have certainly allowed for new applications, that article is nearly 10 years old and the future predicted has not come to pass. There has been no wave of LED-encrusted buildings. The roads don't glow. My bookshelves and whatnot don't either.

If LEDs deliver a 10x improvement in energy use, we'd have to use 10x as much light to offset the energy gains. How likely is that?


>There has been no wave of LED-encrusted buildings.

Lol. Give them time. Here's two examples off the top of my head, in my city: http://urbantoronto.ca/news/2015/07/aura-illuminates-downtow...

http://canderelresidential.com/files/styles/building/public/...


I wish the trend was to reduce outdoor lights in cities instead of increasing them because it looks cool.


> My bookshelves and whatnot don't either.

Speak for yourself :)

I agree, though; I use a lot more lights around the house, but they're small and very directed for their purpose. Many of them are controlled by cheap, ubiquitous PIR modules so that they're only on when needed, with absurdly low quiescent current. I use less light overall because it's only where I need it to be, and I find this to be much nicer on my eyes, and generally pleasing for the evening. The added efficiency is just a bonus on top of all that. Many are undervolted, as well, so should last a very long time; the main concern is the lifespan and quality of the power supplies.


Got any recommendations for the LEDs, power sources, or PIRs?


Have you been to Hong Kong or Beijing?

Lots of LED-encrusted buildings.


True, but they're mostly decorative spots that hardly consume any power at all. Especially compared to old-school floodlights that cast wide glows on the sides of buildings.


How many of those buildings had regular lights on them before?


Considering they are all relatively new construction, I doubt that any previous structures had them - this is a new, and awesome, phenomenon...

I freaking love what Asia is doing with their urban environments...

I just wish that the US developers would take note.

We are building the tallest building in SF ("salesforce" tower) and it's a mundane piece of shit building. Literally zero aesthetic innovation.


> The roads don't glow.

There are those solar roadway people who are hoping to fix that. (I'm not optimistic, though. :) )


This is particularly true for increased outdoor night-lighting, which is harmful to wildlife: migrating birds, light-drawn insects and the bats that feed upon them, insects that require darkness to signal mates (fireflies), nocturnal prey-hunting animals, aquatic animals whose cycles are set to the lunar cycle -- as well as sleeping humans.


Most of the LED street lighting and whatnot I've seen has been more directed than what it replaced. Light pollution has improved, generally.

For a while the color temperature was terrible, but cheap and reliable warm phosphors have mostly fixed that.

As for private outdoor lighting, I'm not sure that lower cost really affects that. People who made poor lighting choices (too much, too cool, too much spill) didn't do it because it was suddenly cheap. They just paid more before. The answer to this is likely some degree of local regulation.


Same with heating. Heat pumps are increasingly popular in Norway since they will heat your house a lot more cheaply than most of the alternatives. In reality, people are increasing the temperature by 1-2 degrees Celsius instead of saving energy.

I read the same thing about four-wheel-drive cars, where the added safety is eaten up by drivers taking increased risks.

Interesting psychology.


> Demand-based lighting, network controlled is the next step.

Oh good, Auernheimer can wake me up in the middle of the night.


Not sure why you're being downvoted; weev has already shown a propensity for using badly secured connected devices to troll people. And you know that someone will have the creds to their system be admin:admin...

http://motherboard.vice.com/read/hacker-weev-made-thousands-...


Lots of <s>botnet</s>internet-of-things fans on here, I guess ;)


> Every other source will be retired in the face of LED greater efficiency, solid state design, and lower cost.

Efficiency is much less important, if you also pay to heat your home.

That doesn't mean they're superior to LED, but it does change the cost/benefit to make it a little less lopsided! :-)


Incandescent bulbs are not nearly as efficient as other sources of heat.

While, sure, it makes the argument less lopsided, you'd be crazy to consider the heat output from bulbs a benefit when replacing them with LEDs.

Don't forget, in the summer you have to pull that heat out of your house also.


Unless your cost of electricity is so low that resistance heating beats heat pumps and natural gas for long term costs, inefficient lighting is an awful way to heat your space.
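To make that concrete, a rough sketch of the cost-per-kWh-of-heat arithmetic; the prices and the heat-pump COP below are illustrative assumptions, not quotes:

    # Approximate cost per kWh of delivered heat. All prices and the
    # COP are placeholder assumptions; substitute local figures.
    ELEC_PER_KWH = 0.12    # $/kWh of electricity (assumed)
    GAS_PER_KWH  = 0.04    # $/kWh of gas energy (assumed)
    FURNACE_EFF  = 0.90    # efficient gas furnace
    HEATPUMP_COP = 3.0     # typical air-source heat pump (assumed)

    print(f"resistive (incl. light bulbs): ${ELEC_PER_KWH / 1.0:.3f} per kWh of heat")
    print(f"heat pump (COP {HEATPUMP_COP}): ${ELEC_PER_KWH / HEATPUMP_COP:.3f} per kWh of heat")
    print(f"gas furnace ({FURNACE_EFF:.0%} eff.): ${GAS_PER_KWH / FURNACE_EFF:.3f} per kWh of heat")

On numbers like these, a bulb's waste heat is worth only a fraction of what the same heat would cost from a heat pump or furnace.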


Incandescent lighting is more energy efficient when the filament is placed inside the right "photonic crystal":

http://news.mit.edu/2016/nanophotonic-incandescent-light-bul...


It's a really cool idea, although their statements about efficiency seem rather questionable to me. LEDs are only twice as efficient as traditional bulbs? A laughable statement, since I get the same light with 1/5th to 1/8th the energy using off the shelf bulbs.


> LEDs are only twice as efficient as traditional bulbs?

I didn't see that claim in the article, but I thought these two paragraphs were wonderfully specific:

One quantity that characterizes a lighting source is the so-called luminous efficiency, which takes into account the response of the human eye. Whereas the luminous efficiency of conventional incandescent lights is between 2 and 3 percent, that of fluorescents (including CFLs) is between 7 and 15 percent, and that of most commercial LEDs between 5 and 20 percent, the new two-stage incandescents could reach efficiencies as high as 40 percent, the team says.

The first proof-of-concept units made by the team do not yet reach that level, achieving about 6.6 percent efficiency. But even that preliminary result matches the efficiency of some of today’s CFLs and LEDs, they point out. And it is already a threefold improvement over the efficiency of today’s incandescents.

Which details in that do you question?


Comparing the theoretical efficiency of a new incandescent design with commercially available LEDs does not make sense, as the theoretical efficiency of white LEDs is also around 40%. I'll consider incandescents once the efficiency of commercially available ones reaches that of commercially available LEDs. I bet $100 that it will not happen in the next 10 years.


> Comparing the theoretical efficiency of a new incandescent design with commercially available LEDs does not make sense

Yes. They say they are currently at 6% efficiency, which is better than the worst CFLs and LEDs, but far from matching the best.

> as the theoretical efficiency of white LEDs is also around 40%

Yes, although one should probably include the power supply. I think the advantage of the incandescent approach is that there are no losses here.

> I'll consider incandescents once the efficiency of commercially available ones reaches that of commercially available LEDs.

Sure, sounds wise. Light quality and longevity might be factors too.

But what does this have to do with the claim that the article says "LEDs are only twice as efficient as traditional bulbs"? The article can be read as claiming that the worst commercially available LEDs are about 2x the efficiency of the best incandescents, and perhaps this is false, but I don't see that as a reason to discount the rest of the details.

Here's earlier discussion of this approach, which includes a link to the actual paper: https://news.ycombinator.com/item?id=11771001


The theoretical efficiency of LEDs is definitely not 40%. LED efficiency has been measured at 230%, at 30 picowatts of power consumption:

https://www.google.com/amp/phys.org/news/2012-03-efficiency....


Their claimed efficiency is only sort of true, in that it measures the electrical input while ignoring the external heat also used to run the light. By similar logic, one could say that a steam engine running in a very hot room is also over 100% efficient.

Also, the "theoretical efficiency of LEDs" is a very different number than the "theoretical efficiency of white LEDs". I think when talking about white LEDs, they are assuming a monochromatic LED pumping a phosphor.

But while it's not that practical, in that it's limited to picowatts (10^-12 watts), it is interesting nonetheless. Another similarly surprising example is that the theoretical maximum efficiency of some fuel cells also exceeds 100%, for similar reasons of extracting heat from the environment.


Indeed. I went to a talk recently where they said that some offices now just use network connections for the whole system (including power), as it means they can centralise the driver system in the basement, which simplifies everything.


Can you clarify? I don't see how a PoE network would allow you to centralize the driver hardware for LEDs. Per the spec, as I understand it, the power provided by PoE is straight DC with some sort of current negotiation process, and I don't think you could do PWM LED drive across any great distance.


I don't understand the general use of PWM anyway: you want to use switching regulators regardless, and it is actually slightly easier to do (almost) constant current than PWM.

And PWM lighting is annoying in a lot of ways.
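As a rough sanity check on why such systems distribute 48V PoE and regulate at (or near) the fixture, rather than piping low-voltage LED drive down the cable, here is the IR-drop arithmetic over a full-length Ethernet run; the conductor resistance and pairing are approximations:

    # Voltage drop over a 100 m Cat5e run at 802.3af power levels.
    AWG24_OHM_PER_M = 0.084   # ~84 ohm/km per conductor (approximate)
    LENGTH_M = 100            # maximum Ethernet segment length
    POWER_W  = 15.4           # 802.3af PSE power budget
    VOLTS    = 48.0           # PoE nominal supply

    # 802.3af carries power over two pairs, so each leg is two
    # conductors in parallel: out-and-back loop resistance, halved.
    loop_ohm = (2 * LENGTH_M * AWG24_OHM_PER_M) / 2
    amps = POWER_W / VOLTS
    drop = amps * loop_ohm
    print(f"loop: {loop_ohm:.1f} ohm, current: {amps:.2f} A, "
          f"drop: {drop:.1f} V ({drop / VOLTS:.1%} at 48 V)")
    # The same 15 W delivered at ~3 V LED voltages would need ~5 A
    # and drop ~40 V in the cable, hence the local regulator.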


For the ThinkPad 'padders interested in modding a CCFL backlight to LEDs, he wrote https://people.xiph.org/~xiphmont/thinkpad/led-backlight.sht...


Speaking of LED lighting, I've been meaning to upgrade my living space lighting. The one killer feature I'm looking for is color temperature control: I want to be able to adjust between a "soft white" (2700K) white point and a "daylight" (5000K) white point.

Basically, what Philips Hue "White Ambiance" [1] bulbs offer. Are there any alternatives?

[1] http://www2.meethue.com/en-us/productdetail/philips-hue-whit...
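For the DIY-inclined, tunable-white fixtures like these are usually just a warm channel and a cool channel cross-faded. Interpolating in mired (10^6/kelvin) space is the customary trick, since it tracks perceived color better than interpolating in kelvin directly. A sketch assuming 2700K and 5000K channels as in the bulbs above (everything else is illustrative):

    # Cross-fade a 2700 K (warm) and a 5000 K (cool) LED channel to
    # approximate a target color temperature, interpolating in
    # mired (1e6 / kelvin) rather than directly in kelvin.
    WARM_K, COOL_K = 2700.0, 5000.0

    def channel_duties(target_k):
        m, m_warm, m_cool = 1e6 / target_k, 1e6 / WARM_K, 1e6 / COOL_K
        cool = (m_warm - m) / (m_warm - m_cool)   # 0 = all warm, 1 = all cool
        cool = min(max(cool, 0.0), 1.0)
        return 1.0 - cool, cool                   # (warm duty, cool duty)

    for k in (2700, 3500, 4000, 5000):
        warm, cool = channel_duties(k)
        print(f"{k} K -> warm {warm:.2f}, cool {cool:.2f}")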


Any reason you don't want Hue? I use LIFX Color bulbs (more pricey, but I got a great price) and they have been one of those "I don't NEED this but it's really fun and wonderful" additions to my life.


I am currently using Osram Lightify bulbs with my SmartThings hub (no separate Osram hub required unless you want to update the firmware), and I have both tunable white and tunable white+rgb bulbs that work great.


Santa, can I please have a separate HN for those who buy lightbulbs with firmware to post to?

Jesus.


LIFX makes white-only bulbs with color temp control for reasonable prices: http://www.lifx.com/products/white-800?variant=933783339

You just missed their Black Friday sale too :(

I have a few and they work well with my other home automation stuff (broadlink IR+RF remote + Aukey RF outlets + Moteino system).


How are you managing the Broadlink kit? It's one of the only things that can control my aircon, but the Android app is pretty bad.


Have to say the evolution in LEDs has been pretty amazing.

First we had terrible yellow fragile Maglites that required several large C or D cells for anything approaching reasonable brightness and battery life. I had my small Maglite die at the bottom of the Grand Canyon, and at the top of Mt. Shasta. After trying to replace a tiny incandescent bulb in the dark in challenging conditions, I decided I was switching to LEDs, no matter what the compromise was.

Similarly, monitors had broad-spectrum CCFL backlights that generated UV, converted it to white light with a phosphor, then threw away most of that light by filtering it down to approximately R, G, and B. The resulting color accuracy was terrible.

LEDs started out expensive, not very bright, and mostly red. They migrated to different colors; I think blue was the most expensive.

Finally, white LEDs started showing up in tiny AA or AAA flashlights with 50-75 lumens: about as bright as the 2xAA Maglite, which was huge in comparison.

Color coverage got better, life got better, efficiency started ramping up.

100-200 lumen flashlights started to become more common. 18650 cells, with substantially more capacity and a more LED-friendly 3.7V, started to appear.

A year or two later, 400-500 lumens were common and efficiency kept increasing. LEDs started to appear as backlights, in car dashboards, brake lights, etc.

Monitor backlights then went from wasting 95% of the light to ditching the white phosphors in the CCFL for a mix of RGB phosphors. The savings were twofold and multiplicative. The first was generating UV -> RGB directly with the mix of RGB phosphors in the CCFL. Additionally, instead of narrow notch filters that didn't pass much light, they switched to very wide-spectrum filters that passed most of the light. So a red pixel didn't need to be within 1% of red; it just had to block 99% of G and B. Since the RGB phosphors emitted very close to the ideal frequencies, the color accuracy increased. Monitor power requirements dropped significantly... only to be killed off by more efficient LEDs.

Things doubled again, with small-ish flashlights much like the 2xAA Maglites except fatter. 1000 lumens hit $50-$100 price points. Now throwing a usable light 300 meters isn't unusual for a small handheld light. Depending on your use, recharging once a month isn't unusual. 4 or 5 light levels are common, and runtimes of 1.5 hours (max light) to 100 or more hours (on low) aren't uncommon.

So now MacBook Pros that are extremely thin (greatly helped by very tiny LEDs for backlights) get 10 hours of battery life with 500-nit screens and wonderful color accuracy. Similarly, even 55" LED screens are crazy high resolution and crazy cheap. The cost per pixel doesn't seem all that different from the cost per transistor.

Sadly, like Moore's law, LEDs aren't getting much better anymore. For the first time, I purchased a new flashlight after only 2-3 years... and the best I could find was zero percent brighter 8-(.

It's not just computers that have been improving like crazy.


>Sadly like Moore's law, LEDs aren't getting much better anymore.

The best LEDs are now remarkably close to the theoretical maximum luminous efficacy. We're now bottlenecked by batteries and the limits of sanity.

Specialist diving companies will happily sell you a 60w, 6300 lumen flashlight. It's not a big market, but it exists. To run a light like that you need a massive external battery pack tethered with an umbilical, because it'll eat through the entire capacity of an 18650 cell in under 10 minutes. That's no problem for divers, because a properly designed battery pack is close to neutrally buoyant; it's a major inconvenience for people on the surface.

1000 lumens is Good Enough for 99% of users of handheld flashlights. There really aren't that many people who need to throw a beam for half a kilometre or light up a football stadium with a flashlight.
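The "under 10 minutes" figure checks out with simple energy arithmetic (the cell capacity below is a typical assumption for a quality 18650):

    # How long a 60 W dive light lasts on one 18650 cell.
    CELL_V  = 3.6    # nominal Li-ion cell voltage
    CELL_AH = 3.0    # typical high-capacity 18650 (assumed)
    LIGHT_W = 60.0

    energy_wh = CELL_V * CELL_AH                 # ~10.8 Wh stored
    minutes = energy_wh / LIGHT_W * 60
    print(f"{energy_wh:.1f} Wh / {LIGHT_W:.0f} W = {minutes:.0f} minutes")
    # Converter losses and the cell's usable-capacity limit push the
    # real figure under 10 minutes.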


> The best LEDs are now remarkably close to the theoretical maximum luminous efficacy.

Do you have any pointers/references where I could read up on this? I'm designing home lighting with power LEDs and finding that, despite all the hype, it's quite difficult to beat the efficiency of TL5 HE fluorescent tubes with LEDs. I also noticed that while the output in lm/W has been increasing, it seems to have increased very slowly these last few years.


Depending on acceptable bandpass and desired color temperature, a white light has a maximum luminous efficacy of 250–370 lm/W. Nichia currently offer a white LED that achieves 210 lm/W at >80 CRI. Cree have high-flux LEDs capable of ~200 lm/W at >70 CRI.

There's room for improvement, but nothing revolutionary. Cree had a prototype device achieving 303 lm/W in 2014, but it isn't currently commercially viable. Lower efficiency LEDs still make up a large proportion of the market for simple reasons of cost.

https://arxiv.org/pdf/1309.7039.pdf https://www.nichia.co.jp/specification/products/led/NF2W757G... http://www.cree.com/~/media/Files/Cree/LED%20Components%20an... http://www.cree.com/News-and-Events/Cree-News/Press-Releases...
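To connect these lm/W figures to the percentage "luminous efficiency" numbers quoted from the MIT article earlier in the thread, divide by 683 lm/W, the theoretical maximum for monochromatic 555 nm green light. A quick sketch (the LED efficacies are the figures cited above; the others are typical values):

    # Convert luminous efficacy (lm/W) to "luminous efficiency"
    # relative to the 683 lm/W maximum (monochromatic 555 nm light).
    MAX_LM_PER_W = 683.0

    sources = {
        "incandescent (typical)": 15,
        "CFL (typical)":          60,
        "Nichia white LED":       210,   # cited above, >80 CRI
        "Cree 2014 prototype":    303,   # cited above
    }
    for name, lm_w in sources.items():
        print(f"{name:24s} {lm_w:4d} lm/W = {lm_w / MAX_LM_PER_W:5.1%}")
    # Realistic *white* light tops out around 250-370 lm/W (as stated
    # above), so 210 lm/W is already well over half of that ceiling.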


Thank you — this is much-appreciated information and interesting reading.


> There really aren't that many people who need to throw a beam for half a kilometre or light up a football stadium with a flashlight.

And for most people, a product that can do that is substantially worse than one that can't. If you want a flashlight to play tag, walk the dog, or wander your house in a power outage, you probably can't use a dive-strength light.

We're finally hitting the point where even modest, internal-battery flashlights can be so powerful that they're painful to look at, and for most people you don't want to go any brighter than that.


That's not a major problem. It's trivial to let it click to a different brightness, and then the battery will last all day.


I wish. Flashlights tend to have all sorts of strange UIs; like television firmware, they seem to be exploring all the possible ways of being annoying.

None of mine have both a way of getting full power fast /and/ deterministically turning it on at the lowest setting.

(Except for the one that only has 'very bright' and where I first wondered why there is no rebound when I turn it on.)
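The UI being asked for is simple to state as a state machine: a click from off always starts at the lowest level, and a hold from off jumps straight to full power. A toy sketch of that logic (not any real flashlight's firmware):

    # Toy flashlight UI: click from off -> always lowest mode;
    # hold from off -> straight to turbo; clicks cycle up, then off.
    MODES = [0.01, 0.10, 0.40, 1.00]   # fraction of full power

    class Flashlight:
        def __init__(self):
            self.mode = None                 # None means off

        def click(self):                     # short press
            if self.mode is None:
                self.mode = 0                # deterministic low start
            elif self.mode == len(MODES) - 1:
                self.mode = None             # past turbo -> off
            else:
                self.mode += 1

        def hold(self):                      # long press
            self.mode = len(MODES) - 1 if self.mode is None else None

    light = Flashlight()
    light.hold()                             # full power, fast
    print("output:", MODES[light.mode])      # 1.0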


> I think blue was the most expensive.

Yes. Which is why to this day so many "cool" accessories insist on blue LEDs as a power indicator, making them completely useless whenever too much light would actually be harmful to the experience (like the PlayStation power LED, or any other power LED on devices you have in your bedroom).

The blue LEDs feel so incredibly bright compared to all other LED types to the point where they become very distracting.

Blue light isn't cool any more (I would argue that it never was in the first place). Stop adding blue LEDs to all your devices. Please.


There is also increasing concern about eye health risks involved with blue LEDs, so tread carefully. I recently had to swap vendors on a project because the cheaper blue LEDs threw too much of the hazardous spectrum. The EU is way ahead of the US on this.

http://www.cree.com/~/media/Files/Cree/LED%20Components%20an...


I had no idea, thank you!


I can barely look at existing blue LEDs these days, which sucks because they're on every emergency phone beacon on college campuses (which I live around).


Too late. Semiotically blue now means 'power LED/main button', so monitors, computer cases, etcetera all have one blue LED for the power button. Even your toaster has a blue LED on its on/off-switch.

Changing this will take years. Perhaps premium brands will start making red power LEDs hip again, and we will enter a period of red LEDs (again). On the other hand, violet power LEDs might become fashionable, so count your blessings.

I was amused that the HTPC computer case I chose had a small slider just below the blue power indicator; you can slide it to the right, and the blue is no longer visible.


"Too late. Semiotically blue now means 'power LED/main button', so monitors, computer cases, etcetera all have one blue LED for the power button. Even your toaster has a blue LED on its on/off-switch."

Red/off, green/on (stop/go) has a much, much deeper symbolic link and would be easy to adopt and understand, no matter how entrenched blue LED symbolism has become or will become.


The threat with red, though, is that it also has a strong symbolic link to "error". If you power down your console and suddenly see a red light, it conjures up awful memories of the Xbox 360.


Probably not ideal for the significant chunk of the population with red/green color blindness though.


For the sake of both efficiency and reason, "off" needs no color of LED at all.


A customer currently wants to return a 12 bay NAS because the blue power LED is too bright for their office.


It might be a good proxy for quality. A company that used the devices it made would almost certainly have a problem with the eye-melting brightness of the LEDs, and a good company would be expected to correct it.


Did you tell him to put a piece of tape over it?


ThinkGeek used to have it; I found it on Amazon:

https://www.amazon.com/LightDims-Original-Strength-Electroni...


I just use gaffer's tape. Edit: if you need to see the light sometimes, I make a flap by folding the tape over so it covers the LED, but leave part of it non-sticky.


Another idea is to use a black Sharpie on the offending LED. Blocks most, but not all, of the light.


This is a really clever trick; I'm surprised I haven't heard it before.

"Hide your indicator lights" really isn't reasonable, but marker or masking tape would be good candidates for blocking 80%+ of the light.


My NAS has a brightness control for its LEDs. You can even set a schedule so they're off at night only...


I think you might be overstating the strength of the association. Around me now I have 3 power buttons with white light [0], and 2 with blue light [1]. The color isn't very important because the power symbol makes the purpose of the button obvious.

[0]: a Dell tower PC, a MBP with power button on the keyboard that is backlit like any other key, and a Dell monitor

[1]: two other models of Dell monitor


I have blue, red, white, green power LEDs pointing at me now. Green is my preference for fully powered on indication.


Some high-end monitors provide control over the power LED. My friend's NEC monitor even has a brightness control for it.


Once you're at the point where you have to further decrease your margin in order to make some purely cosmetic feature configurable, maybe it's time to rethink whether you really need that feature in the first place.

Or in this case specifically, maybe it's time to rethink whether you really need a power LED on your monitor, as it will be undoubtedly obvious whether a monitor is in fact powered or not just by looking at it.


> as it will be undoubtedly obvious whether a monitor is in-fact powered or not by just looking at it.

Will it? Is the screen dark because it's off (janitor bumped it at night), because my laptop didn't wake up when I docked it, or because I forgot to plug the cable back in after fooling with an SBC on that monitor yesterday?

The power light makes the state of the monitor obvious, and so does an image displaying on it. The light's useful for the edge cases when there isn't an image being displayed.


That reminds me that I covered the bright blue LED on the bottom bezel of my ViewSonic LCD with electrical tape a couple years ago. I can't believe nobody at ViewSonic thought it would be a bad idea to focus a bright blue LED towards the user of a display.

I just double checked to make sure, no LED control on this monitor :(


I've got some low-end no-name from about 2008. It's got 3 layers of paper taped over the blue power light to make it reasonable. There's also the Antec Sonata case from about 2004...it's got 2 super-bright blue LEDs with reflectors. Here's a wonderful idea: design a case to look understatedly elegant, then give it two bright blue headlights. I disconnected them after the first night.


This is a real shame. Dim red lights are the hallmark of standby/warning for good reason - they're non-distracting and comfortable to have lots of. It's a real shame that blue took over when it's probably the worst color available.

And as an aside, having a light that turns on to tell you when a product is off/asleep is just offensively bad design.


Presumably it reduces the number of support calls solved by "did you plug it in?"


Comfortable to have lots of warning lights?


Red is still best because of night vision... that's why bedside clocks are still mostly red.


They are very annoying. I have several layers of electrical tape over most of my electronics--one layer is usually not enough!


Electrical tape does the job in one application. You can even buy matching colors if standard black contrasts with your device/bezel/case.

Edit: sorry, I did not see 'electrical' in your post. What device has enough luminosity to shine through e-tape? My electric toothbrush can illuminate the entire bathroom, hall, and most of my attached bedroom. One strip cured it.


The HDD light on a run-of-the-mill tower PC case. The random blinking makes it particularly bad in a dark room. I could see the whole room light up with HDD activity through one layer of e-tape.


I literally put tape and nail polish over any blue LEDs. Leave it so just a little light gets out and they become tolerable for non-bedroom use. All bedroom LEDs are duct-taped over, and phones are put LED-side down.


> I think blue was the most expensive.

When blue LEDs were first available in sample quantities, the R&D Lab in which I worked ordered a few 5mm blues for £35 each - no free samples available!


The blue lights were brighter: they have a mechanism different from the old LEDs that is much more efficient. I guess the designers just dropped in the blue LEDs, and voilà, terribly bright lights.

The other colors later followed suit; the brightness you now get with a few mA is amazing when you remember the old LEDs.
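The difference shows up directly in the classic current-limiting-resistor calculation, R = (V_supply - V_forward) / I. The forward voltages and currents below are typical ballpark values, not from any specific part:

    # R = (V_supply - V_f) / I for a simple indicator LED on 5 V.
    V_SUPPLY = 5.0

    def resistor_ohms(v_forward, current_ma):
        return (V_SUPPLY - v_forward) / (current_ma / 1000.0)

    # Old-style red indicator: ~1.8 V forward, ~20 mA to be clearly seen.
    print(f"old red  @ 20 mA: {resistor_ohms(1.8, 20):.0f} ohm")
    # Modern high-efficiency blue/white: ~3.2 V forward, glaring at 2 mA.
    print(f"new blue @  2 mA: {resistor_ohms(3.2, 2):.0f} ohm")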


Great comment. It reminded me of the various technologies that have matured over the last 10-20 years, which at the time were exciting but are now standard fare: bright energy-efficient LED lighting, touch-screen phones, optical image stabilisation, SSDs, large LCD & OLED display panels, fast internet, wireless networks. Now intelligent sous-vide circulators the size of handheld blenders are cheap enough for the average person to try out, electric cars are desirable, full-frame and medium-format mirrorless cameras are becoming more and more affordable, phase detection is expected in phone camera sensors... I could go on, but when you stand back and look at the body of advancement from the last 10 years you'd need to be crazy to be underwhelmed. We've seen rockets land on drone barges in rough seas.


> It's not just computers that have been improving like crazy.

Personally I feel like advances in computers have been creeping along at best in the last ~five year period. Ever since Intel beat the shit out of AMD with the Bridge architectures and the x86 market became a de-facto monopoly.


> Personally I feel like advances in computers have been creeping along at best in the last ~five year period.

Depends which spec you are looking at. While not much has happened with flops/core, flops/watt has improved dramatically in the low-energy space.


This. Current generation laptops are finally fast enough to be my primary developer experience without feeling like I'm sacrificing performance. I'm still sacrificing screen real estate and biomechanical efficiency with the crappy keyboards, but the compute is finally there.


Came to say the same... the improvements seen lately have concentrated on power efficiency, one of the few areas where other platforms have an advantage in some scenarios...

This has kept the server space rolling over a bit, which is somewhat nice; it's also helped laptop battery life immensely. There's not as much need for true desktops, but I imagine the NUC class will continue to improve over the next few years.

It just depends on your needs... Also, the GTX 1080 is impressive to say the least, and still a bit under-powered for gaming at 4K (which is what is now pushing that space). Encoding to h.265 is still really intensive as well.

The use cases where more power is needed are eroding, but there are still a few, and the use cases where the power usage needs to be lower still are many.


To be honest it feels like "advances" in computers have been speed, cores and little else. Ever since the abandonment of the last non-PC platform. It might get interesting again when ARM is directly competing against x86 - but that's due to the advances of handhelds/phones.

Lots of cores is all very well, but a lot of applications still can't utilise them that well.

The last time computing hardware felt exciting was the late 90s or early 00s, when there were half a dozen graphics players, SPARC, PowerPC, 680x0 and the odd strange idea like NeXT, transputers and BeBoxes.

That we care more about the case, the interfaces and the coloured LEDs says we're firmly past the innovation era.


> Lots of cores is all very well, but a lot of applications still can't utilise them that well.

Right, which is the real issue I think. It's not that we can't make faster computers, it's that we've run out of good software use cases, just like the limiting factor on how good videogames look is no longer how good the engine and rendering are but rather how good the artists can make the textures and models.

There is still innovation going on - Intel's recent hardware transactional memory support is very exciting - but for the most part we simply already have more CPU capability than we know what to do with.


That and so much of our computing is now using all that horsepower to play thin client.


I think we might see more cores (and other innovations) becoming useful for offline AI/ML purposes, as well as virtual reality applications. Both right now are mostly dependent on GPU cores (the more the merrier), but can be bottlenecked by the CPU usage (depending on the task, of course - games tend to want more of everything). Though both of those might stay niche areas, since most people are moving to mobile devices and any need for more compute power can come from "cloud resources".


Probably a more interesting development is secure memory (Intel SGX, AMD SME/SEV), because that has the potential to make virtualisation / cloud offerings far more secure - it effectively allows a VM to enforce that the hypervisor / service provider cannot read its memory, effectively barring the SP from being a few commands away from root on every hosted system.


Yeah, seems like Intel's been cruising easy. Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake, and Kaby Lake have been pretty small increments for general-purpose computing.

Does seem like AMD has scared Intel into overdrive. Seems like all price points are likely to get significantly more cores in the next generation after stagnating for, what, a decade? The normal high-end desktop has been quad-core since way back at the Q6600 in 2007. Desktops with 8 cores/16 threads and servers with 32 cores/64 threads seem to be Intel's answer to AMD's Zen.

Here's hoping AMD delivers on promises and is competitive.


> Does seem like AMD has scared Intel into over drive. Seems like all price points are likely going to get significantly more cores in the next generation after stagnating for what a decade?

Another factor might be that the software industry is finally adapting to multi-core being the new normal. I mean sure, server software's been forking or multi-threading for ages, but I get the impression the bulk of desktop software did most of its work in a single thread until fairly recently - so having more than 2-4 cores wasn't really worth it, it would just have increased cost without making things feel that much faster for the average user.


Not only that, but going with more cores hurts in a variety of ways. It's harder to get a wider chip clocked up as far, so single-thread performance tends to suffer. It's hard to manage clocks and scheduling appropriately especially when you have non-uniform processors, like different turbo capability (max multiplier) between cores or even worse big.LITTLE type architectures. It's very easy to end up ping-ponging between high power/low power modes, and battery life suffers.

If it were physically possible, it would have been strongly preferable to have just kept increasing clocks instead of going wider. That was a decision made out of necessity, not choice.

You see big gains from being able to multitask a little; past that it's really diminishing. All things being equal, a 2C2T or 2C4T at 6+ GHz would be strongly preferable to 4C4T or 4C8T at 4-4.5 GHz for most desktop use. The gains aren't that steep in the real world, of course. The reality is that it's more like a choice between 2C2T at 4.7 GHz (G3258, being a bit gracious here), 4C8T at 4.4 GHz (6700K) or 6C12T at 4.1 GHz (5820K). You can push all of those numbers up a little bit farther by pushing the voltage, but that's roughly the shape of things at stock-ish voltages.

I diverge from the common wisdom here: given those numbers, I think the 50% increase in real cores from jumping to 6C12T justifies a 10% loss in single-thread performance for many users, especially given that they're essentially the same price. In the threaded tasks where Hyper-Threading provides gains, the extra cores will really push you ahead. If you're just gaming or browsing, a 4C4T is really all you'll ever need (and realistically many people will be fine with a 2C4T or 2C2T anyway). So the 4C8T parts are really an awkward in-between to me; I really think it's mostly that people don't realize that the 6700K isn't "top of the line" and there's a whole other enthusiast socket out there.

The server market's a little different, of course; you can go much wider, but good single-thread performance still beats parallelism in response times. Really wide architectures like Qualcomm's 24-core ARM server chips have yet to really prove themselves except in throughput-oriented/latency-insensitive applications (which are the easy gimme for slow-and-wide designs).
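The clocks-versus-cores tradeoff described above is essentially Amdahl's law. A crude sketch comparing the hypothetical configurations mentioned, treating single-thread speed as proportional to clock and letting only the parallel fraction of the work scale across cores (a deliberately simplistic model):

    # Crude Amdahl's-law model: serial work runs on one core at the
    # given clock; the parallel fraction scales across all cores.
    def perf(clock_ghz, cores, parallel_frac):
        serial_time = (1 - parallel_frac) / clock_ghz
        parallel_time = parallel_frac / (clock_ghz * cores)
        return 1.0 / (serial_time + parallel_time)

    configs = {"2C @ 4.7": (4.7, 2), "4C @ 4.4": (4.4, 4), "6C @ 4.1": (4.1, 6)}
    for frac in (0.2, 0.5, 0.9):          # how parallel the workload is
        row = ", ".join(f"{name}: {perf(c, n, frac):4.1f}"
                        for name, (c, n) in configs.items())
        print(f"parallel fraction {frac:.0%} -> {row}")

Lightly threaded loads favor the higher clock; only heavily parallel ones reward the extra cores, which is the shape of the argument above.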


I very much agree with you.

Another interesting observation is that, on cache-coherent systems, the maximum possible clock rate also tends to go down as soon as more than one socket is used.

> (which are the easy gimmie for slow-and-wide designs)

Nowadays they are more of an historical anecdote, but some time ago Sun created the "niagara" line of barrel processors - a technique that is a form of simultaneous multi-threading, like Hyper-Threading, just with 8-16 threads / core.

These UltraSparc T1, T2, ... processors held quite some throughput benchmarks - but were obviously not feasible and not meant for general purpose use.


> Another interesting observation is that, on cache-coherent systems, the maximum possible clock rate also tends to go down as soon as more than one socket is used.

Which is perfectly sensible if you think about it. No matter how fast your cache coherence mechanism is, it's slower than not sharing cache between two chips. It's fine if you are storing program code or independent data, but if you are operating on the same cache lines as another processor you have to flush the dirty lines over and possibly roll back a whole bunch of execution that might have been done there.

Sometimes this even happens when the data you are working on is different but shares a cache line (false sharing).

> Nowadays they are more of an historical anecdote, but some time ago Sun created the "niagara" line of barrel processors - a technique that is a form of simultaneous multi-threading, like Hyper-Threading, just with 8-16 threads / core.

We do a lot of contracting with a state government and I'm pretty sure they're still running these in their datacenter. It's some flavor of SPARC for sure, and I was talking to our sysadmin and he mentioned that they used a processor that I remember was a lot wider than hyperthreading, which sounds a lot like UltraSPARC.

From what my guy explained they are pretty nice in database workloads especially since they're also an Oracle DB shop. They are locked into Oracle products seriously hard and they're unwilling to contemplate moving to commodity products. Probably unable to, given their staff's skillset and in-house operating procedures. It shouldn't take a tiger team for someone to suggest restarting a misbehaving server. You get what you pay for though. The state doesn't pay much and people who have other options don't stay in that kind of work environment.


I think Nvidia, with their 512-core Xavier chip, is also taking some of Intel's thunder.

I wonder how long before bits of Linux are running on the GPU? Once the process of porting to the GPU starts, more and more code will run there.


I very much doubt that. Of course someone might do it for kicks, but not for serious use; GPUs already have a task scheduler built into them, and compute cores don't like branching code - less so if it only needs a couple cores, like a general-purpose OS.

There is little reason to waste them (and compromise efficiency) for Linux or any other OS.


> Yeah, seems like Intel's been cruising easy.

I don't know; their plans have stalled (tick-tock is gone and buried), Iris is being killed, and their rollouts are complete shit-shows (even ignoring Xeons, there were 9 months between the availability of the first and last Skylake mobile parts). Maybe that was intentional because there's no competition (it's not like AMD is much of an option these days), but that seems a bit odd.


Iris isn't being killed. The new MacBook Pros are using Iris, there are a few NUCs with Iris, IdeaPad 710s, a few Sonys, etc. Don't forget that the Iris parts are a half cycle behind the non-Iris ones.

Basically, once Intel supply catches up with demand, bugs are fixed, there's been one rev of silicon after production, BIOS tweaks, etc., then Intel starts pulling wafers out of the supply, adding an extra cache chip for the extra GPU memory bandwidth, and starts selling the Iris parts.

I'd expect the Kaby Lake Iris parts to be out in Q1 or Q2 of 2017.


> Iris isn't being killed.

The KL roadmap reduces Iris to just some dual-core 15/28W U parts, and is adding HD-based quad-core U parts alongside that. The trend is clearly towards removing Iris.

> The new macbook pros are using Iris

These are skylake parts, and the 15" only uses HD.

> Don't forget that the IRIS parts are a half cycle behind the non-IRIS.

I'm not forgetting anything, I'm looking at the roadmap.


Where is this idea that Iris Pro is being killed coming from? How likely is it?

It just seems strange. They've been touting their Skull Canyon NUC with Iris Pro pretty heavily towards gamers and to be honest at that size, Iris Pro is exactly what I want for a lounge room gaming and 4K movie PC.

It just seems like such a shame; why kill Iris Pro when it's just actually becoming desirable?


Iris Pro might die; I've seen rumors along those lines. But the high-end Iris offers little and doubles the price.

Compare: the NUC with the Iris 580 is $600-$700 ish; the NUC with the Iris 540 is $340, and often they throw in 8GB of RAM for free. The 580 is hardly a gaming GPU; it doesn't really run much that you can't run on the 540. Sure it's faster, but far from an AMD/Nvidia killer.

So the Iris Pro might die, but I expect the Iris line to continue.

Certainly the popularity of the Iris-based MacBook Pro will help. Hell, I wish more of the Skylake Iris parts were easy to get. I've been hoping for an Iris 550-based NUC or similar system for a quiet desktop. I've yet to find one.


> But the high end iris offers little and doubles the price.

The MSRP difference between the 6820HQ (2.7/3.6, HD 530) and 6870HQ (2.7/3.6, Pro 580) is 12%: http://ark.intel.com/compare/88970,93340

The former is $378, the latter is $434.


I was comparing the NUC systems. It's kinda weird to compare CPU prices when you can't buy the CPU individually, and those that can buy them typically get better pricing.

What an end user can buy is a NUC; the Iris 580 version is something like $650 list, $575 street. The Iris 540 version is $400 list and around $340 street (but often with "free" 8GB of RAM).

Iris seems generally a failure on the desktop, but fairly popular in mini PCs and laptops. Apparently the NUC was one of the few success stories in PC desktops for the last year or two.


I hope AMD produces something like Skull Canyon when they release their Zen-based APUs. If Zen is competitive this should be really nice, since the GPUs from AMD are better.

Ideally with a higher-tier version that uses HBM, so the GPU is not held back by memory bandwidth.


Sadly, the rumors claim the first-gen Zen parts will not be APUs.


Maybe we simply hit the point where CPUs work well enough to serve their purpose and where the physical tradeoffs make it impractical to extend Moore's law.

Proper credit: Maciej's observation, not mine: http://idlewords.com/talks/web_design_first_100_years.htm


The whole premise of the article relies on people actually wanting to retrofit ThinkPads with new backlights, which is directly helped by the slowness of advances in computers.


> LEDs started out expensive, not very bright, and mostly red. Migrated to different colors, I think blue was the most expensive.

I love that you can date a lot of electronics by the colour of the LEDs (anyone still own an alarm clock with a red display?). As soon as a new colour was possible/cheap enough the last 'hot' colour was dropped quick-smart.


Just last year I replaced my nearly 20-year-old red LED alarm clock. That thing was built like a tank.

Finding a new one with LEDs in any color other than blue was essentially impossible. Which was especially infuriating because I specifically wanted to avoid adding more sleep-disrupting blue light to my bedroom.


I recently had to replace my ancient Sony alarm clock. I found one that used an LCD with a backlight that wasn't blue. And of course, it was still too bright.

So I put some window tint on it. I picked up some samples with different transmissivity from the local shop that did my car, and mixed & matched until I got what I wanted, then applied them.


Thankfully there do exist alarm clocks that specifically cater to this problem. Philips had an alarm clock that would light up slowly (using a halogen bulb) and optionally play the radio or (oddly enough) the sounds of cows in an Alpine meadow, or a Buddhist gong, or birds chirping (the radio turned out vastly preferable).

The clock display was backlit by orange LEDs and seemed even less noticeable than the classic red LED display you mention.


Philips still makes those kinds of clocks. The "light up slowly" comes from LEDs, now, and the color of the light ramps up from red, through orange, eventually to white, over the course of a (user-configurable) timespan. They're pretty expensive for an alarm clock. I have one (it was a gift), and I gotta say, it's a way better way to wake up than any cell phone alarm could be. Very gradual and gentle.

The phrase to search for is "Philips wake up light", on Amazon or whatnot. They make a few models, but the prices start around $60.
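The ramp itself is easy to reproduce with any RGB-capable fixture: fade from deep red through orange to white over the configured timespan. A minimal sketch of just the color math; set_pwm() is a hypothetical output function standing in for whatever actually drives the LEDs:

    import time

    def sunrise(duration_s, steps=200):
        """Yield (r, g, b) duty cycles fading deep red -> orange -> white."""
        for i in range(steps + 1):
            t = i / steps
            r = min(1.0, 3 * t)                  # red rises first
            g = min(1.0, max(0.0, 3 * t - 1))    # then green (red+green = orange)
            b = min(1.0, max(0.0, 3 * t - 2))    # blue last, completing white
            yield r, g, b
            time.sleep(duration_s / steps)

    # 30-minute ramp; set_pwm() is hypothetical, not a real library call:
    # for r, g, b in sunrise(30 * 60):
    #     set_pwm(r, g, b)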


I bought one last year and quite hate it, compared to my other (10-year-old) alarm. While the waking up by light is nice, the UI is terrible, the clock doesn't even keep time correctly, and it doesn't have a weekend mode (so it woke me twice when I could have slept in). It's almost surprising how much they could mess this up, and it's not cheap at $100 for an alarm clock.


Ha! This was exactly my experience. I just assumed the problem was that I'd bought it used with no instruction booklet, but wow it was lousy!


Ours works great, though it's fairly non-intuitive; whenever the time gets messed up my wife asks me to fix it, and it always takes me a few tries to figure out what does what again.

Seems to keep time fine, nothing turns on randomly, etc. Not cheap though.


I just use my phone for that :)


I have a Philips wake-up light and I hate it: the radio is terrible, there's no week scheduling, the light volume knob broke after a year or so, it can power itself up with the radio in the middle of the day or night for no reason, it does not keep time accurately (for such a huge price), and so on. But the clock display is orange and dim (with 3 levels of dimness), so I use it as a night clock only; it really is good for that.


I had one of those, and eventually replaced it with one of these lamp controllers:

https://www.amazon.com/Lighten-39-95-Uses-Your-Lamp/dp/B00AX...

The time settings were much better than the Philips, which had sort of an awful UX. Also, you can have a longer ramp up, which better approximates a real sunrise.

For the lamp, I used a 3 bulb floor lamp, with Satco halogen lightbulbs (the kind that, for ~70 watts, replace a 100 watt incandescent). It's a higher watt draw, but I'm only running it for an hour or so each morning.


My cats went APESHIT the first time the birds went off.


I chose a Braun clock, since the LCD backlight can be switched off, leaving it nice and dim, and the projection onto the ceiling can be set to a dim red.

I'm not sure it's a feature, since the apartment is rented and I'll probably need to pay for the repair, but I dropped it on the floor, and it has left a fairly obvious dent.

http://www.braun-clocks.com/clocks/digital/bnc015-rc-digital...


Red, yellow and 2600K white are my ideal colors for night time. Green LEDs make too much light due to humans' increased sensitivity to the color, and blue has its own problems.


Outside of specialist use I really don't think they need to get better now. I have a promotional cheapie LED torch, maybe $3-5 worth, that beats any of the Maglites I've ever had. The little camping lantern I have is ridiculously good. The decent torch is so unbelievably bright that "torch" doesn't really cover it any more. Sure, it eats batteries at top output.

Tungsten is for oven lamps, specialist use and little else.

Now I'd like to be able to buy 10-year-life household bulbs that actually last and preserve something resembling their colour temp. A set of LED daylight bulbs I bought 2 years ago is down to 1/2-2/3 of original brightness, and all except one have drifted into "warm white" territory.

By 10k hours they'll be down to nightlights. Realistically they'll have been replaced long before.

Half the desk lights on Amazon, including expensive brands like Luxo, are using "LED" to mean a fixed "expensive manufacturer replacement part" rather than a bulb.

I'm hoping for a continuation in solar and a breakthrough in batteries so we can tip away from fossil fuels at a similar rate.


I've a bunch of LED flashlights, powered by 18650 cells. I've one charger, and one extra 18650 cell; the unused cell is always sitting in the charger; when I need a fresh battery in any flashlight, I just swap cells.

The flashlights are cheap stuff from China, $10 or thereabouts. The cells are about half that price. The charger is about same price as a flashlight.

The damn things are blindingly bright. The whole family has an 18650 flashlight on their bikes, and I always get comments from people when I'm biking through the neighborhood. Oh, and the devices are tiny; the barrel is just big enough to contain the lithium cell.

In terms of amount of light, I'm not even sure I want something better. Seriously, they beat the pants off the antiquated Maglite candles in every way.

The current regulator inside is not resistive, it's a switched regulator with a frequency in the audible range. If it's very quiet in the room and you press the barrel to your cheekbone, you can hear a very faint whine from the regulator.


> The whole family has a 18650 flashlight on their bikes

Please make sure the beam is directed if you do this. I've seen many people riding around with lights that are quite blinding to oncoming traffic, especially on dark roads. Sometimes just pointing regular flashlights down can solve this, but not always.

The US hasn't caught on yet, but countries like Germany have mercifully begun to regulate this.


And sometimes they'll have their super bright light on strobe. Bikers out there, if you want to make sure I can't tell your distance or heading, put your ultra bright light on strobe.


Strobe is fine during the day. At night it's a stupid idea. Nobody can see anything anymore, including you.


The whole "everybody drives jacked up SUVs" thing we have here makes it worse.


What's 100x worse is folks that take a 300-500 lumen flashlight then put it in flash mode to save the battery. Practically blinding everyone else on the road.

As part of my public duty I switch my 1000 lumen flashlight into flash mode which often causes audible complaints. It does seem to get my point across.


Most bike lights I've shopped for don't have a beam cutoff. And you HAVE to aim them at the horizon, because otherwise the hotspot right in front of you will destroy your night vision and make it impossible to ride at more than a few mph.


In tech diving there came a tipping point a couple of years ago where LEDs replaced HIDs almost overnight - the wait was for reflectors that could shape the light properly. It was quite sad to see people trying to sell their expensive HID heads for even half of what they originally paid, now that they were basically worthless.


The exact same thing happened in cycling, particularly mountain biking. Here in Michigan, starting with the time change, 50% of riding (for those who work) is in the dark.

Not only did LEDs make it cheaper and easier for us to get lights; the cost is now so low that there's been a huge increase in the number of people riding after dark. This is a really great thing, since having more people out in the woods after dark makes it overall safer, more welcoming, etc.


Are you talking about headlights? I believe HID lights (at least aftermarket ones) are complicated by needing more power and/or a higher voltage than is usually available.

Do LED bulbs fix that and work with stock power systems in a car?


> Are you talking about headlights?

No, scuba diving. Tech divers who operate at extreme depths or in overhead environments need powerful flashlights. They need to be reliable and redundant, because you really don't want to be stuck in a cave system or a shipwreck with no source of light.

HID was the gold standard of dive lighting for many years. The thermal issues that accompany HID lighting practically disappear underwater, because you're floating about in the world's biggest watercooling system.

As gaius said, LEDs suddenly overtook HID lighting a few years ago. The brightness, light quality and beam shape of LED lights matched HID, at which point the changeover was inevitable. LEDs don't explode if you drop them on the dive boat, they turn on instantly, they have consistent color temperature and they're efficient and easily dimmable. HID and HMI still have a role in underwater photography, but it's a shrinking niche.


I need new flashlights and you seem more knowledgeable than the average Amazon reviewer -- give us some links!


It takes a lot of research on candlepowerforum or budgetlightforum to start to get an idea of what features you like and what's important to you. I've found the best quality, smallest, most efficient flashlights are from Zebralight. http://www.zebralight.com/ I've got about 30 flashlights, and the 5 Zebralights are my favorite by far.


Agreed on Zebralight. Rock-solid construction, very compact and very powerful.



If you're looking for a headtorch, then I can recommend the Ledlenser XEO19R - https://www.ledlenser.com/uk/products/headlamps/xeo-series/x...


candlepowerforums


Heh, yeah, that's the center of the crazy flashlight nerds. For keychain stuff it doesn't matter much; anything with a few button cells, AAA, or AA will do the trick.

If you want to see across parks, light up the tops of distant trees, flag down helicopters, etc., then get an 18650-based light.

Fenix is a good place to start, but there are many companies that make good lights. Even $50-$60 buys a hell of a light.


+1 on Fenix. Eagletac is another great, reasonably priced brand whose lights have held up well for me over the last few years.


I was pleased to learn that I could get a replacement "bulb" (more of a bare chip mounted on a bulb base, really) for my old-school Maglite -- the type that takes multiple D batteries, has a focusable beam, and will serve as an improvised bludgeon if need be (because it's made of honest metal).

While I haven't made any formal measurements, the result is several times brighter than the old bulb, and of course the batteries last many, many times longer. Recommended.


LEDs are an adventure.

Five years ago I replaced all my light bulbs with LEDs, but it was not easy.

First I had to find a brand that didn't sell that clinically white light-emitting crap, and then I had to buy 3 times as many as I needed and send 2/3 back because they didn't all emit the same color.


I tried various LED bulbs, many of which did not pass the wife's standards for color temperature. I eventually settled on the products Costco sells, which are coincidentally also quite inexpensive.


It's trivial these days - just go to Home Depot and pick up a few models and see what you like. They generally are pretty consistent now.

That said, some people are absolutely insane when it comes to color temps. I learned this the hard way helping friends convert to LED - the pickiness is something I never knew existed, and it gave me a new appreciation for how much people care about "their" version of light.


It's a lot easier these days. It still takes buying a couple of different options to get the color right, but they're consistent.


I've had Ikea LED light bulbs for just over a year now. They're cheap enough, and labelled with useful units: for example 400 lumens, 6.3 watts, 2700 kelvin, 25,000 hours.

The dimmer ones were made in China, but the brightest ones were made in Germany.
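Those labels make it easy to sanity-check efficacy; a quick sketch using the figures above, with a rough incandescent baseline that's my own ballpark rather than anything from a label:

    bulbs = {
        "Ikea LED (per label)": (400, 6.3),        # lumens, watts
        "40 W incandescent (approx)": (450, 40.0), # assumed baseline
    }

    for name, (lm, w) in bulbs.items():
        print(f"{name}: {lm / w:.1f} lm/W")

That's ~63 lm/W for the LED against ~11 lm/W for the incandescent -- roughly a 6x efficacy gap.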


I put his conversion kit in my X61t and it is whoa-Nelly bright, and the colors are spectacular. I recall that it also costs a fifth of what a genuine ThinkPad CCFL backlight module costs.


Those daylight backlight upgrade kits are nice. We boat nerds always lust after stuff like that, because we like to take computers out in the sun (and rain and airborne saltwater) and this is one of the key parts for doing that.


Speaking of LEDs, I've been on and off looking for 0402 or 0603 pink LEDs. There seem to have been multiple manufacturers building them years ago, but they appear to have been discontinued and gone obsolete. Now I can't seem to find them at the larger distributors (Digikey/Mouser/etc.). And I'm leery of buying them off of AliExpress/eBay, because I wouldn't want to have to test each one before using it, and for this particular project it wouldn't be worth it to rework the board if we had to replace non-functioning LEDs. Anyone know more about the trials and tribulations of pink LED manufacture?


I don't know the reasons behind the shortage, but if this is a one-off thing, TME still carries some: http://www.tme.eu/gb/details/osk40603c1e/smd-colour-leds/opt...

By the way, wouldn't it be feasible to just mix red and blue, either as discrete LEDs or in RGB packages?
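Red plus blue alone tends to read as magenta; pink needs some green to desaturate it toward white. If an RGB package were used, a naive starting point might look like the sketch below -- the sRGB target and gamma are assumptions for illustration, and real parts would need per-channel brightness calibration:

    # Naive pink on an RGB LED: map an sRGB target to 8-bit PWM duties.
    TARGET_RGB = (255, 105, 180)   # "hot pink" in sRGB, an assumed target

    def duty_cycles(rgb, gamma=2.2):
        # Crude gamma linearization so mid-range values aren't washed out.
        return tuple(round(255 * (c / 255) ** gamma) for c in rgb)

    print(duty_cycles(TARGET_RGB))  # -> (255, 36, 119)

Driving three channels per pixel does complicate dense layouts compared with a single dedicated pink emitter, though.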


Thanks for the find. I want to cram about 100 of these as close together as possible.


Go for it before they sell out! :) TME is our primary supplier, and in my experience they are not exactly stellar at restocking components once they run out.


How much was a strip of 5000 LEDs?


$1720 delivered, and that's been fully recouped.


Can someone explain what it is he was trying to do? As in, why did he want such a small batch of LEDs?


It sounds like he's replacing the old CCFL backlights in ThinkPads with LEDs.



