Imagine the Universe!
Ask a High-Energy Astronomer

The Question

(Submitted April 20, 1998)

What is the error factor in the distances to galaxies? For instance, the Andromeda galaxy (M31) is said to be 2.8 million light years away. How accurate is that? Is the error plus or minus 1 million light years, or what? And does that error factor get much worse the farther out we go?

The Answer

Nearby galaxies, where you can see individual stars, probably have about a 10% distance uncertainty.
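As a quick illustration using the 2.8 million light year figure from the question, a 10% uncertainty works out to roughly plus or minus 0.3 million light years rather than plus or minus 1 million. A minimal sketch of the arithmetic:

```python
# Rough check of what a 10% distance uncertainty means for M31.
# The 2.8 Mly figure comes from the question; 10% is the estimate above.
distance_mly = 2.8           # distance to M31, millions of light years
fractional_uncertainty = 0.10

error_mly = distance_mly * fractional_uncertainty
print(f"M31: {distance_mly} +/- {error_mly:.1f} million light years")
# -> about +/- 0.3 million light years, not +/- 1 million
```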

For galaxies at distances where you can't see individual stars, the distance is found from the galaxy's redshift using Hubble's law: the recession velocity (the redshift times the speed of light, for small redshifts) divided by a number called the Hubble constant H0. The value of H0 is a contentious issue, but the two extreme camps argue for ~55 or ~70 km/s/Mpc, which means there is about a 30% range in how far away people think any given galaxy is. The redshift itself can be measured accurately no matter how far away the galaxy is, so this uncertainty is the same constant factor for every galaxy: you know that one galaxy is 3.0 times as far away as another, and not 3.1 or 2.9 times.
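Here is a minimal sketch of that dependence on H0, using Hubble's law d = cz/H0 with the two values mentioned above (the redshift below is a made-up example, not a real galaxy):

```python
# How the redshift-based distance depends on the assumed Hubble constant.
C_KM_S = 299_792.458              # speed of light in km/s

def hubble_distance_mpc(redshift, h0_km_s_mpc):
    """Distance in Mpc from Hubble's law, d = cz / H0 (small redshifts)."""
    return C_KM_S * redshift / h0_km_s_mpc

z = 0.01                          # hypothetical galaxy redshift
for h0 in (55.0, 70.0):           # the two 'camps' mentioned above
    print(f"H0 = {h0:.0f} km/s/Mpc -> d = {hubble_distance_mpc(z, h0):.0f} Mpc")
# The two answers differ by roughly 30%, but the *ratio* of distances
# between any two galaxies is the same whichever H0 you pick.
```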

For galaxies so far away that their light has been traveling for a good fraction of the age of the universe, the question is how constant the Hubble constant really is. Depending on how much mass there is in the universe (and thus how much gravity has slowed the expansion, so how much faster the universe was expanding in its early days), this can add an additional uncertainty of up to about 50%.
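One way to see this effect is to compare the distance inferred for the same redshift in two matter-only model universes that decelerate by different amounts. This is only an illustrative sketch: the H0 value and the two density values are assumed numbers, not measurements, and the models ignore anything other than matter and curvature.

```python
# How the assumed matter density (and hence how much the expansion has
# decelerated) changes the distance inferred for the same redshift.
C_KM_S = 299_792.458
H0 = 65.0                          # assumed Hubble constant, km/s/Mpc

def comoving_distance_mpc(z, omega_m, steps=10_000):
    """c * integral of dz'/H(z') for a matter-only Friedmann model."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz        # midpoint rule
        hz = H0 * (omega_m * (1 + zp)**3 + (1 - omega_m) * (1 + zp)**2) ** 0.5
        total += dz / hz
    return C_KM_S * total

for omega_m in (1.0, 0.2):         # high-density vs low-density universe
    d = comoving_distance_mpc(1.0, omega_m)
    print(f"Omega_m = {omega_m}: distance to a z=1 galaxy ~ {d:.0f} Mpc")
# The low-density (less decelerated) universe puts the same redshift
# noticeably farther away, and the gap grows with redshift.
```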

David Palmer
for Ask a High-Energy Astronomer
