A remarkable story of discovery
How far can you see? The question is not so straightforward. Anyone can glance up at the night sky and see starlight that originated thousands of light years away, but our brains' default software truncates everything to the average horizon of some twenty miles. Consider that monkeys, birds, and insects can see those same stars; do you think they are capable of seeing thousands of light years? If not, what is the alternative?
I contend that the origin of sensory input is not the decisive factor. When I look at the full moon, I see only reflected sunlight; am I therefore looking at the sun, some 93 million miles away, or am I seeing the moon, a mere 239,000 miles away? Better yet, take a gander at this and tell me how far away you're seeing:
That's a NASA artist's rendition of an exoplanet, ergo, I think you'd agree that you're seeing something about two feet away (your monitor). Now imagine that this is a physical drawing and you're seeing it on someone's desk via NASA's live webcam; what's your distance measure now? You might say that it's still a mere two feet, because the light that strikes your eyes came from your monitor. That's a boring answer, and not very consistent with common usage. Recall the previous paragraph about the sun and the moon. Look at your keyboard and tell me if you're looking at your keyboard (like I told you to do), or if you're looking at a distorted and darkened image of your light bulb (whence the photons originated).
In fact, every bit of light that reaches your eyes has been distorted, translated, and generally fuzzed up, and it achieves all this without the benefit of technology. Photons get doppler-shifted, they interact with our atmosphere, and they do funky stuff that's reminiscent of a hypothetical elastic collision between equal-mass billiard balls.
And when that happens, you can't tell whether one photon generated another identical photon or simply passed on unimpeded, and, if I understand my quantum mechanics, it is actually incorrect to say that there is a difference. What you see really depends on the information, however transmitted, and therefore depends on your knowledge.
Irritated yet? Okay, I'll get on with the story.
When I was a wee lad of about ten (and I was pretty wee, so I probably looked more like a lad of seven), I had learned enough to know that the stars and stuff were really, really far away. But I also knew there was no way I could see for trillions of miles. Sure enough, when I looked up at the night sky, I perceived twinkling lights some tens of miles away. And that was it. I wondered how those scientists did it. Many of you may still wonder. I figured it had something to do with telescopes, or maybe spaceships. Surely, when those guys flew to the moon, they had a better view of things and the distances were much more obvious.
So I asked for a telescope for Christmas. My parents obliged with the best that a non-millionaire could afford, which gave me something slightly better than ordinary binoculars. When I looked up at the stars and planets I was rather disappointed to learn that everything looked almost exactly the same.* I did have some fun looking at the moon, and distant landscapes, but I figured the telescope simply wasn't big enough. So I shifted my dreams to plan B, which involved becoming an astronaut and/or building my own spaceship.
Soon after that, I noticed girls, and couldn't think about anything else for a very long time.
My journey of discovery resumed in college, when I took a course in astronomy. But here's the remarkable thing: I was scarcely aware of it at the time, because I was following the same journey that had begun thousands of years ago with curious-minded folk the world over. What's more, it had nothing to do with spaceships, and far less to do with telescopes than I ever expected.
One such journeyman was Nicolaus Copernicus. He watched the stars go round and round in predictable fashion, as did the sun and moon, while a few oddballs among the group liked to stop and go backwards every now and then. Nick didn't like that one bit.
The mystics, we should note, were more than happy to attribute these oddities to supernatural intervention; just as mystics today are content to describe odd noises as ghosts and strange lights as alien visitors. Persistent anthropomorphisation of unknown phenomena is the hallmark of bad imagination.
Anyway, Nick was rigorous and imaginative, and insisted that something else must be going on. Thus he came up with the heliocentric model, with Earth and the planets going in circles around the sun at various distances. Not only was this model simpler, but it paved the way for the first calculations of interplanetary distances. Shockingly, until then, mankind had absolutely no clue as to how far away the sun was. But why would they? You look at it (or near it, don't burn your eyes) and have a guess. But by knowing that Venus was between the sun and the Earth, and that both moved in (roughly) circular orbits, astronomers could now use a transit of Venus to measure the distance to the sun.
Thus armed with knowledge, we can now look at the sun (again, make it a cursory glance) and see something that's 93 million miles away. That's over 4 million times further than our most primitive ancestors could see. Wow. Let's stop and give a toast to dear Copernicus, and to Jeremiah Horrocks, who made the first good measurement in 1639. Well done!
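If you'd like to check that 93-million-mile figure yourself, here's a quick sketch using modern numbers: a solar parallax of about 8.794 arcseconds (the angle the Earth's radius subtends as seen from the sun) and an Earth radius of about 3,963 miles. Horrocks's own figures were rougher, so this is the modern version of his arithmetic, not his.

```python
import math

# Modern values; Horrocks's 1639 estimate of the solar parallax was larger.
EARTH_RADIUS_MILES = 3963.0
SOLAR_PARALLAX_ARCSEC = 8.794

# The sun's distance is the Earth's radius divided by the tangent
# of the solar parallax angle (converted from arcseconds to radians).
parallax_rad = math.radians(SOLAR_PARALLAX_ARCSEC / 3600.0)
distance_miles = EARTH_RADIUS_MILES / math.tan(parallax_rad)

print(f"{distance_miles / 1e6:.1f} million miles")  # roughly 93 million
```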
But of course, we're not done. How 'bout them stars? They're reeeeeaaalllllly far away. But now that we know the distance to the sun, we can use a trick called parallax:
If that diagram confuses you, step outside and look at your neighbor's chimney against the distant horizon. Now walk sideways. See how the chimney moves relative to the hills? The closer your neighbor's house, the "faster" the apparent motion of the chimney. Likewise, by observing nearby stars at six-month intervals (when the Earth has moved "sideways" by twice its distance to the sun), we can compare their positions relative to more distant background stars. Friedrich Wilhelm Bessel first did this in 1838, measuring the distance to 61 Cygni at around three and a half parsecs, or some 67 trillion miles. Earth-based parallax measurements are effective for stars up to about 100 parsecs away, or 1,900 trillion miles. Now we can see 20 million times further than Jeremiah Horrocks. And the best is yet to come.
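The rule hiding in that chimney experiment is delightfully simple: with a one-AU baseline, a star's distance in parsecs is just one divided by its parallax angle in arcseconds (that's how the parsec is defined). A quick sketch, using the modern parallax of 61 Cygni of about 0.286 arcseconds (Bessel's own 1838 figure was a bit larger):

```python
# One parsec is about 19.2 trillion miles.
MILES_PER_PARSEC = 1.917e13

def parallax_distance_pc(p_arcsec: float) -> float:
    """Distance in parsecs from a parallax angle in arcseconds."""
    return 1.0 / p_arcsec

# 61 Cygni, using its modern parallax value:
d_pc = parallax_distance_pc(0.286)
print(f"{d_pc:.2f} pc = {d_pc * MILES_PER_PARSEC:.2e} miles")  # ~3.5 pc, ~67 trillion miles
```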
With a catalog of several thousand stars at measurable distances, astronomers can do all kinds of tricks with spectral analysis (looking at the color of starlight) and measures of luminosity (how bright the stars are). One good trick is to make a chart:
The height (y-axis to my fellow geeks) shows how much light each star puts out. They can figure that out by comparing the apparent magnitude (how bright it looks from here) and the known distance (taken from parallax). The bottom of the chart shows the spectral class, and further spectral analysis can take a star's temperature, which pretty well narrows down what type of star you're looking at.
See the trick that's coming? Armed with this information, you can look at a star of unknown distance, determine what type of star it is, and therefore how much light it puts out. By comparing that absolute magnitude to the apparent magnitude, you can calculate the distance. You probably feel drunk with power by now, so we'll do just one more step.
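That comparison has a tidy formula, the distance modulus: m - M = 5 log10(d / 10 pc), where m is the apparent magnitude, M the absolute magnitude, and d the distance. Solving for d gives the sketch below; the magnitudes here are made-up illustrative numbers, not any real star.

```python
def distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs via the distance modulus:
    m - M = 5*log10(d/10pc), so d = 10**((m - M + 5)/5)."""
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# A star that looks like magnitude 10 but, judging from its spectral
# type, should shine at absolute magnitude 5 (roughly sun-like):
print(f"{distance_pc(10.0, 5.0):.0f} pc")  # 100 pc
```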
At much greater distances, outside our own galaxy, it's difficult (if not impossible) to observe individual stars. Unless they blow up, which can happen in one of several ways.
Regrettably, stars eventually die. When that happens, if they're more than three times the mass of our own sun, they can collapse into a neutron star or even a black hole. If they're less massive than our sun, they aren't dead yet (because the universe is only 14 billion years old, silly). Stars in between those sizes become white dwarfs, and after blowing off excess mass in their death throes they are all remarkably similar.
Now, many stars in our universe are part of a binary system. Ergo, many white dwarfs are paired with neighboring stars. As you probably guessed, then, the smaller white dwarf that isn't putting out any stellar wind is likely to accumulate matter if its companion is sufficiently nearby:
If the white dwarf accumulates enough mass, it triggers carbon fusion and explodes again. And the really cool part here is that "enough mass" is the same wherever you go. Any such white dwarf that blows up is going to do so with the same mass, no matter how long it takes to build up to it, and thus they produce nearly identical explosions. These are called Type Ia supernovae. So: We've already measured the distance to a lot of stars within our own galaxy and nearby companions; and we've seen a few of them go up as Type Ia supernovae, producing an exceptionally distinct light curve (spectral analysis again) and having the same "brightness" (absolute magnitude) every single time. Once again, we observe a more distant object (a Type Ia supernova in another galaxy) and compare that absolute magnitude with the apparent magnitude to determine the distance. We might need a telescope, but at this point we can see objects as far away as 1,000 megaparsecs; which is about 19 billion trillion miles; which is ten million times further than we could see using parallax; which is roughly 1,000,000,000,000,000,000,000 (a sextillion, if you like) times as far as can be seen with the naked brain.
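Numerically it's the same distance-modulus trick, just with a far brighter standard candle. A sketch assuming the commonly quoted Type Ia peak absolute magnitude of about -19.3; the example apparent magnitude is invented, not a real supernova.

```python
# Approximate peak absolute magnitude of a Type Ia supernova.
TYPE_IA_PEAK_M = -19.3

def distance_mpc(apparent_mag: float) -> float:
    """Distance in megaparsecs from a Type Ia supernova's peak
    apparent magnitude, via the distance modulus."""
    d_pc = 10 ** ((apparent_mag - TYPE_IA_PEAK_M + 5.0) / 5.0)
    return d_pc / 1e6  # parsecs -> megaparsecs

# A hypothetical supernova peaking at apparent magnitude 20.7:
print(f"{distance_mpc(20.7):.0f} Mpc")  # 1000 Mpc
```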
Of course, all of that is a mere outline of some of the major techniques. Like all good science, astrometrics involves plenty of corroborating evidence. But at the least, I hope this shot of Supernova 2006dd in the galaxy NGC 1316, some 70 million light years away, looks like more than a haze of pixels to you now.
I can see for miles and miles and miles and miles and miiiiiiiiiiiiiiiiiiiiles....
* If you know someone with a low-powered telescope and astronomical curiosity, I recommend looking at Jupiter. Even with a pair of binoculars, it can be seen as a small disc surrounded by pinpoints of light which are the Galilean moons.