Monday, June 12, 2006

Astronomical Data - Part 2a: Reddening, Absorption, and Extinction

In the past few posts on this topic we learned that light is created when electrons fall to a lower energy level and have to give off their extra energy in some form. That light then travels as a strange particlish wavy thing across billions of miles to come twinkle in our night skies.

But all that work is in vain if we can't do something with it. So in this post we'll explore the history and challenges of detecting light from outside our planet.

The first challenge comes before the light even gets to our solar system. While for the most part space is a damn near perfect vacuum (averaging only about one atom per cubic centimeter between the stars), some regions are a lot less empty than others.

Nebulae (clouds of gas and dust) can obstruct the light of a star on its way to us. Ultimately, there's not much that can be done about this, since it's hard to pin down how much material lies along the line of sight so that we can subtract out its effects.

However, we can recognize that the effects are present. Most nebulae are made primarily of the most common element in the universe (hydrogen), along with a sprinkling of dust, and that mix lets some wavelengths pass more easily than others. The longer the wavelength, the less passing through these clouds affects it. For the visible part of the spectrum, that means red light passes through relatively easily while blue light is cut down. Thus, the star appears redder than it really is. Not surprisingly, this effect is known as "interstellar reddening".
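
If you want to put rough numbers on it, astronomers usually describe reddening by a "color excess" — how much redder a star's measured color is than its spectral type says it should be — and scale that up to a total dimming using a ratio of about 3.1 for typical dust. Here's a minimal sketch of that bookkeeping in Python; the star's numbers are made up purely for illustration:

```python
# Minimal sketch of a reddening correction, assuming the standard Milky Way
# dust ratio R_V = 3.1. All of the star's numbers below are invented just to
# show the arithmetic.

R_V = 3.1  # typical ratio of total (A_V) to selective (E(B-V)) extinction

def deredden(observed_V, observed_BV, intrinsic_BV):
    """Return the extinction-corrected V magnitude, the color excess, and A_V."""
    E_BV = observed_BV - intrinsic_BV   # how much redder the star looks
    A_V = R_V * E_BV                    # total dimming in the V band, in magnitudes
    return observed_V - A_V, E_BV, A_V  # smaller magnitude = brighter

# Hypothetical star: measured V = 12.4 and B-V = 0.95, but its spectral type
# says its true color should be B-V = 0.65.
true_V, E_BV, A_V = deredden(12.4, 0.95, 0.65)
print(f"Color excess E(B-V): {E_BV:.2f} mag")
print(f"Extinction A_V:      {A_V:.2f} mag")
print(f"Corrected V:         {true_V:.2f} mag")
```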

Depending on where we're looking in the galaxy, such nebulae may or may not be common. In our galaxy, almost all nebulae are found in the plane of the galaxy, which is that faint band of light in the sky known as the Milky Way. Therefore, if we're looking at stars in this region, we can expect them to show much more reddening than stars farther from that plane.

These nebulae become so thick in the Milky Way that after a relatively short distance, stars are completely blocked in the visible. Therefore, to "see" these stars, astronomers must use longer wavelengths such as infrared and radio.

However, there's another region of gas that's far more important: our atmosphere.

Although our atmosphere is a far thinner layer than any nebula, its effects are much more pronounced because it is vastly denser and is constantly changing. Additionally, its chemical composition is very different. Unlike the rather simple dimming of hydrogen clouds, our atmosphere's blocking is much more complex, as this diagram shows:

[Diagram: how deeply different wavelengths of light penetrate the atmosphere]

What this diagram shows is how far different wavelengths can penetrate our atmosphere. What we can see is that short-wavelength radiation like gamma rays and X-rays doesn't get through the atmosphere at all.

The rainbow you see at the top is the visible part of the spectrum. Just to the left, you'll notice that a tiny bit of radiation is able to get through. This is ultraviolet radiation. It's the stuff that gives us tans and causes cancer. It's a damn good thing our atmosphere does a pretty good job of blocking it.

However, on the other side of the visible spectrum, you'll notice things get much more complex. Just to the right is the infrared. Various parts come right through while others are blocked completely. This choppy pattern extends into the microwave as well.

The next big window where radiation can get through is in the radio part of the spectrum. This is the range in which radio and TV stations send their signals. Still longer-wavelength radio waves, however, are blocked.

So our atmosphere does some funky blocking stuff. In the visible part of the spectrum, we can see that even at its best, not quite 100% of the light gets through; somewhere around 10% is still blocked. Thus, if we want to know how bright a star really is, we're going to have to take this into account.

But that figure only applies if you're looking straight up. And how often is a star straight up? The lower an object sits in the sky, the more air its light has to plow through, so when worrying about how much light our atmosphere kills off, we have to consider where in the sky that object is at any given time.

You're probably familiar with just how big an effect this is. Perhaps the most commonly seen, and also the most dramatic, example is the moon on the horizon:


[Photo: the moon glowing red-orange near the horizon]

As with light passing through nebulae, our atmosphere tends to scatter blue light more (which is why the sky is blue). Thus, the moon looks redder the more atmosphere it has to go through.

This effect is known as "atmospheric extinction", and as you've seen, it depends not only on the wavelength of the light but also on the amount of air it has to go through. So that's all one big mess.
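
To put some rough numbers on that mess: the standard bookkeeping is to say the atmosphere steals a certain number of magnitudes per "airmass", where one airmass is the air you look through straight up, and the airmass grows roughly as 1/cos(angle from the zenith) for objects reasonably high in the sky. Here's a minimal sketch of that correction; the extinction coefficient is only a typical ballpark value, not something measured for any particular site:

```python
# Minimal sketch of an atmospheric extinction correction. Rule of thumb:
# dimming (in magnitudes) = k * X, where k is the extinction coefficient for
# a given site and filter, and X is the airmass -- roughly 1/cos(zenith angle)
# away from the horizon. The k below is just a typical ballpark value.
import math

def airmass(zenith_angle_deg):
    """Plane-parallel approximation; fine for objects well above the horizon."""
    return 1.0 / math.cos(math.radians(zenith_angle_deg))

def correct_for_extinction(observed_mag, zenith_angle_deg, k=0.2):
    """Magnitude the star would have with no atmosphere in the way."""
    return observed_mag - k * airmass(zenith_angle_deg)

# The same (made-up) 10th-magnitude star at three different heights in the sky:
for z in (0, 30, 60):
    print(f"{z:2d} deg from zenith: airmass = {airmass(z):.2f}, "
          f"corrected mag = {correct_for_extinction(10.0, z):.2f}")
```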

But wait! There's more!

As I mentioned briefly, our atmosphere is constantly changing. It's being blown around by all sorts of winds and acts as an ever-changing lens. And because a star's light already arrives as an essentially perfect point, this ethereal lens can never improve on that; it can only degrade the image by blurring it out.

Obviously, the more atmosphere light has to travel through, the more it's going to be blurred. How blurred a point of light ends up is called the "seeing" in astronomy. This (as well as the absorption mentioned previously) is why observatories are placed at high altitudes. Yet even at these locations, the best seeing is generally about half an arcsecond. I assume most of you aren't familiar with an arcsecond, so let me digress for a minute to explain:

If the entire sky, including the part below the horizon, is pictured as a sphere and divided into 360º, then each degree can be divided into 60 segments called arcminutes, and each arcminute into 60 more segments called arcseconds. That means 1 arcsecond is 1/3,600 of 1º. That seems so tiny that there's no room to complain.
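
If that unit chain feels slippery, it's just two divisions by 60. Here's the half-arcsecond seeing figure from above expressed back in degrees:

```python
# The whole unit chain: 1 degree = 60 arcminutes = 60 * 60 = 3600 arcseconds.
ARCSEC_PER_DEGREE = 60 * 60

best_seeing_arcsec = 0.5  # the "half an arcsecond" figure from above
print(f"1 arcsecond = 1/{ARCSEC_PER_DEGREE} of a degree")
print(f"0.5 arcseconds = {best_seeing_arcsec / ARCSEC_PER_DEGREE:.6f} degrees")
```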

However, let's put that in perspective: the moon is about half a degree across, which means it's 1800 arcseconds wide. In my moderate-size telescope (8" mirror) I can quite easily magnify the moon by 200 times. At that point my field of view is about 1/3 the width of the moon, or 600 arcseconds. So if the seeing were as good as that 0.5-arcsecond figure, any detail larger than 1/1200 of that field of view would be visible. Not too bad.

However, only some of the best observatories in the world (like the ones on Mauna Kea) get this sort of benefit. In a normal suburb the seeing is often much poorer due to lower altitude, air currents from the uneven heating of cities, and a much higher amount of pollution. There, we can expect things to be blurred to a size of 10-20 arcseconds.

That means that in my magnified moon, anything smaller than 1/60 to 1/30 the width of my field of view is going to be blurred away. That's quite noticeable. That's also why piling on magnification in amateur telescopes is ultimately rather pointless: due to the turbulence of the atmosphere, there is simply a limit to how clear an image you can get, no matter how big your telescope is.
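
If you want to see why the atmosphere, and not the mirror, is the bottleneck, compare the telescope's theoretical resolution (the usual 1.22 × wavelength / aperture estimate) against the seeing: whichever is bigger wins. A quick sketch, using my 8" mirror and typical suburban seeing as the example numbers:

```python
# Rough sketch: a telescope's theoretical (diffraction-limited) resolution vs.
# the blurring from the atmosphere. Whichever number is larger sets the finest
# detail you can actually see.
import math

ARCSEC_PER_RADIAN = 3600 * 180 / math.pi   # about 206,265

def diffraction_limit_arcsec(aperture_m, wavelength_m=550e-9):
    """Classic 1.22 * lambda / D estimate, converted to arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RADIAN

aperture = 0.20         # an 8" mirror is roughly 0.2 meters
suburban_seeing = 15.0  # middle of the 10-20 arcsecond range from the text

diff = diffraction_limit_arcsec(aperture)
print(f"Diffraction limit of the mirror: {diff:.2f} arcseconds")
print(f"Suburban seeing:                 {suburban_seeing:.1f} arcseconds")
# Here the seeing is ~20x worse than the mirror's limit, so a bigger telescope
# gathers more light but doesn't show any finer detail.
```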

So now we have light that's being dimmed by nebulae and our own atmosphere, and then blurred. Things aren't looking too great here for getting good data.

And that's only nature getting in the way. In my next post, I'll explain the difficulties in getting images that are due to the instrumentation as well as how they're compensated for.
