Thread: Minimum brightness of star to damage eye from Earth-based telescope?

1. Member · Joined Jan 2008 · Posts: 98

Minimum brightness of star to damage eye from Earth-based telescope?

There are two motivations for my question. The "practical" one is how bright a nova can get before amateur astronomers should stop treating eye safety as a non-issue for telescopic observations. The other is more fictional world-building than anything practical; e.g., what if a star like Rigel were only a dozen light-years from Earth? Would it still be safe to look at in a telescope? Or even with the naked eye?

2. I guess you're looking for something that can damage the retina faster than the blink reflex protects it. My best estimate for that is the luminance (surface brightness) of an atomic bomb explosion in the first 3 ms, which is reportedly about 10¹⁰ lx·sr⁻¹.
The diffraction limit of the human eye is about 10⁻⁷ steradians, so we'd need a star with an illuminance of about 1000 lux. (For comparison, that's about a hundredth of direct sunlight, but as much light as you'd want on a workbench for fine work.) Converting from illuminance to apparent magnitude is not an exact conversion, but it's about

m_v = -14 - 2.5 log10(E_v)

which in this case comes out to be a star of about -21.5 apparent magnitude, which would damage your retina about as fast as an atomic bomb flash (lots of other things being equal, which they probably wouldn't be, but it's a ballpark figure).
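As a sanity check, the arithmetic above in a few lines of Python (the 10¹⁰ lx·sr⁻¹ threshold and 10⁻⁷ sr solid angle are the round order-of-magnitude figures from the post, not precise physiology):

```python
import math

# Round figures from the discussion above (order-of-magnitude only):
L_threshold = 1e10   # damaging luminance, lx/sr (atomic-bomb flash, first ~3 ms)
omega_eye = 1e-7     # diffraction-limited solid angle of the eye, sr

# Illuminance a point source must deliver to reach that luminance
E_v = L_threshold * omega_eye          # 1000 lx

# Approximate illuminance-to-magnitude conversion from the post
m_v = -14 - 2.5 * math.log10(E_v)
print(E_v, m_v)   # 1000.0 -21.5
```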

The magnitude increment for a telescope is 5 log10(Do/Dp), where Do is the objective diameter and Dp is the pupil diameter (7 mm is conventional).

So a ten-inch telescope (254 mm) gives a magnitude increment of 7.8 magnitudes, meaning you'd see a star of magnitude -13.7 to the naked eye as -21.5 through the telescope.

ETA: For comparison, using your example, moving Rigel 70 times closer makes it about 5000 times brighter, which is about 9 magnitudes, putting it at about -9 apparent magnitude.
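The inverse-square step can be checked the same way (the factor of 70 is from the post; Rigel's current apparent magnitude of about +0.1 is my assumption):

```python
import math

factor = 70                      # how much closer Rigel is moved
dm = 5 * math.log10(factor)      # ~9.2 magnitudes brighter (inverse-square law)
m_rigel = 0.1                    # Rigel's apparent magnitude today (assumed)
print(round(m_rigel - dm, 1))    # -9.1, i.e. about -9 as above
```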

Grant Hutchison
Last edited by grant hutchison; 2020-Dec-06 at 10:52 PM.

3. Banned · Joined Feb 2005 · Posts: 12,154
Let’s hope something doesn’t go supernova as you look at it.
Unlikely? Yes...but two cars in an otherwise empty parking deck can still be the source of a fender bender

4. Originally Posted by publiusr
Let’s hope something doesn’t go supernova as you look at it.
Unlikely? Yes...but two cars in an otherwise empty parking deck can still be the source of a fender bender
But a supernova does not brighten instantly. It takes some time to reach its maximum brightness.

Here is an article that shows a light curve.

https://astronomy.swin.edu.au/cosmos...a+light+curves

5. Probably worth noting that my calculations above are for point sources--sources smaller than the diffraction limit of the naked eye, about one minute of arc. More extended sources don't increase their luminance as they get closer, or when viewed through a telescope--they just get brighter in proportion to their angular area, so their surface brightness remains the same. In practice, I don't think that makes a difference to the OP's scenario.

Another interesting matter arising is that the 10¹⁰ lx·sr⁻¹ threshold is only a few times brighter than the solar disc. Presumably there's some evolutionary advantage in having protective reflexes that make us blink or look away from the sun fast enough to avoid retinal damage, but no selection pressure to have reflexes any faster than that. Hotter stars than the sun have greater surface brightness, which passes the "atomic bomb" threshold early in spectral class A. So it's possible that if we got close enough to a star of class O or B, we'd incur retinal damage if we simply glanced in the direction of the stellar disc. (Larry Niven dealt with this sort of problem in the short story "Grendel", when he had visitors to Gummidgy wearing goggles that superimposed a black dot on the planet's sun, CY Aquarii.)

Grant Hutchison

6. Established Member · Joined Feb 2009 · Posts: 2,291
Originally Posted by Jens
But a supernova does not brighten instantly. It takes some time to reach its maximum brightness.

Here is an article that shows a light curve.

https://astronomy.swin.edu.au/cosmos...a+light+curves
But it does not show the light curve of the first day, let alone the first minute.
The Clarke Burst was magnitude +5.8 at 7.5 billion light-years. That would make it as bright as the Sun at, what, 2400 light-years? Though I was not accounting for the redshift.
The maximum lasted just 30 seconds. If you saw a gamma-ray burst in the Milky Way, how long before it illuminates a starry night?
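That distance figure can be reproduced with the usual inverse-square magnitude scaling, assuming the burst (GRB 080319B, the "Clarke event") peaked at +5.8 from about 7.5 billion light-years, and taking the Sun's apparent magnitude as -26.7; redshift is ignored, as noted:

```python
import math

m_burst = 5.8      # peak apparent magnitude of the Clarke Burst
d_burst = 7.5e9    # distance, light-years (assumed)
m_sun = -26.7      # apparent magnitude of the Sun (assumed)

# Distance at which the burst would have appeared as bright as the Sun
d_equal = d_burst / 10 ** ((m_burst - m_sun) / 5)
print(round(d_equal))   # 2372, i.e. roughly the 2400 light-years quoted
```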
