Excellent point to bring up, James! I'm averse to LEDs for just this reason: blue light is part of their very nature, and even though a blue emitter can be phosphor-coated to appear "white", the blue component is still present to a greater or lesser degree. Blue light at night isn't all that welcome from a physiological standpoint (I would add from an aesthetic standpoint as well).
Try looking at the spectrum of very high CRI LEDs like the Nichia Optisolis. The blue spike is very tiny, especially in the 5000K version.
The effects of blue light have long been noted, both scientifically and anecdotally. Regarding the latter, one item that I recall reading years ago, in a book on hypnotism of all things, was that the color blue was soporific; it was observed during the construction of European cathedrals that painters who were using primarily blue paint tended to fall asleep at their easels much more frequently than those using other colors. That would seem to counter the prevailing wisdom that the color tends to keep people awake, but then there is probably much more at work here.
Except it doesn't. I'm a night owl. I often don't go to bed until sunrise or close to it (no, I'm not a vampire in case that's what you're thinking, although members of my family are). There's a sh*t ton of sunlight coming in through my bedroom windows. I have no problems falling asleep, or getting a good sleep. Also, the spectrum of natural light at night is heavily tilted towards the blue. You have moonlight at ~4000K. The brightest stars tend to be the bluest thanks to stellar physics. The average CCT of nighttime starlight is probably well into the 6000Ks at least. And our scotopic vision peaks in the cyan range, as opposed to yellow-green for our photopic vision. These things all point towards blue light giving us cues to sleep, not wake up. In light of this, of course painters using more blue tended to fall asleep.
I personally find blue LEDs relaxing. The key here is intensity. A phone screen blasts your eyes with light at a very high intensity; even if the backlight were incandescent, it would keep you awake. Maybe instead of subjecting people to funky yellow-tinted screens to counter excessive blue light, we should keep the CCT the same but reduce the intensity to the point where the screen is only just readable. Problem is, many phones just don't go that low. If you compared a yellow-biased screen at normal intensity to a regular ~6500K screen at the intensity I'm suggesting, you would find the blue light output of the latter was a fraction of the former's.
The problem isn't the spike in blue, it's the overall energy output in the blue spectrum. Reduce the intensity, and that drops below the threshold where it has any effect on people, even if the spectrum is still spiky. Proof? Starlight doesn't wake us up, despite being heavy in the blue part of the spectrum, because of its very low intensity.
One other thing regarding the color of light at night: Thomas Edison was quite careful in his development of the original electric light bulb to ensure that the spectrum emitted was as close to that produced by a crackling fire as possible; I don't have his exact quote at hand, but it was something to the effect that humanity has been sitting around fires at night for millennia, and that we are adapted to the range of colors that open flame produces, and he thus wished to emulate that spectrum as closely as possible.
One wonders, if the first light bulbs had originally been of a cold, bluish tint, whether they would have received such rapid universal acceptance.
I tried searching for such a quote without luck, but in all honesty this sounds like one of those urban myths. For one thing, incandescent light is much whiter than firelight, even if it's still heavily biased towards the yellow. For another, before incandescent we already had arc lamps, whose spectrum was more or less close to sunlight, and I don't recall reading about any complaints or lack of acceptance. They never made it into homes because they were too expensive, too complex for lay people to operate, and probably not amenable to being scaled down enough for household use. After all, nobody wants a glaring 25,000 lumen lamp in their living room.
If there had existed a filament which remained solid at 5000K, doubtless this would have been Edison's choice. The reason? Simple physics: a 5000K blackbody emits visible light far more efficiently than one at 2500K. Back then, electricity was expensive compared to now, and a bulb which might be a factor of 10 more efficient would have conquered the marketplace had it existed. If anyone objected to the light color, doubtless yellow filters would have been made available; even with the filter, the bulb still would have been far more efficient than a 2500K bulb. Also, low CCT is a preference. In Asia, 5000K is usually preferred in homes. In the end our eyes work best under sunlight, which varies from the mid-4000s to the low 5000s K depending upon the season and time of day.
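That efficiency gap can be sanity-checked with a quick Planck's-law integration. A rough sketch, not a reference calculation: it uses a well-known Gaussian approximation to the CIE photopic curve V(λ) rather than the exact tabulated values, plus a simple midpoint integration, so the numbers are only ballpark.

```python
import math

H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K = 1.380649e-23        # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def planck_exitance(lam, T):
    """Blackbody spectral exitance at wavelength lam (m), W/m^2 per metre."""
    return (2 * math.pi * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * T)) - 1)

def v_photopic(lam):
    """Gaussian approximation to the CIE photopic sensitivity V(lambda).
    Only approximate, but good enough for a rough efficacy comparison."""
    um = lam * 1e6
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def luminous_efficacy(T, lam_lo=360e-9, lam_hi=830e-9, steps=2000):
    """Lumens emitted per watt *radiated* by a blackbody at temperature T (K)."""
    dlam = (lam_hi - lam_lo) / steps
    lumens = 0.0
    for i in range(steps):
        lam = lam_lo + (i + 0.5) * dlam   # midpoint rule over the visible band
        lumens += 683.0 * v_photopic(lam) * planck_exitance(lam, T) * dlam
    return lumens / (SIGMA * T**4)        # total exitance via Stefan-Boltzmann

for T in (2500, 5000):
    print(f"{T} K: {luminous_efficacy(T):.1f} lm per radiated watt")
```

On this rough integration the 5000K blackbody comes out around an order of magnitude more lumens per radiated watt than the 2500K one, which lines up with the "factor of 10" figure above.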
And no, we're not adapted to fire. For starters, early man spent most of his day in daylight. Fire light was perhaps experienced for a few hours at night. For another, such evolutionary changes don't occur over tens or hundreds of millennia. They take much longer in an animal like humans with ~20 years or more between generations. There would also have to be a survival advantage to seeing better under fire light. There isn't because it's not widely prevalent in nature. In fact, most animals tend to flee fire, not adapt to see well under its light.