The Question
(Submitted January 25, 2001)
From what I've read, elements heavier than iron can only be created by
supernovae. Similarly, elements beyond hydrogen, helium, and lithium can
only be created in the cores of stars over the course of their lives.
The constituent elements of stars, including our Sun, can be determined
by spectrographic analysis. I believe that our Sun and many other stars
already have heavy elements within them, which indicates that they were
formed, at least in part, from material from ancient supernovae. I
believe spectrographic analysis can also be used to determine the
overall distribution of matter in whole galaxies, just as it is used on
stars (I'm speculating here, since I've never heard it before, but it
seems logical).
Since supernovae and stellar evolution occur over time, there should be
more heavy elements now (at our present point in cosmic history) than
there were near the beginning. So what do we see when we look back in
time at distant galaxies?
My questions are:
If we do spectrographic analysis on galaxies from oldest to youngest,
does a change in the distribution of heavy elements show up?
If so, what does it tell us about the rate at which supernovae occur on
average?
How many supernovae per galaxy per century are required to account for
the current distribution of heavy elements found in stars?
The Answer
First, it does not require a supernova to create elements heavier than
iron: heavy elements can also form in the cores of massive stars before
they go supernova (the s-process isotopes). Second, some elements beyond
helium are formed in planetary nebulae, and some can also be formed
through cosmic-ray collisions. So the picture is a bit more complicated.
Now for your questions. Galaxy metallicity (the fraction of heavy
elements) can be derived from emission-line spectroscopy of planetary
nebulae and H II regions in nearby spirals and irregulars, and from
absorption-line spectroscopy of large ensembles of stars in elliptical
galaxies. The metallicities of galaxies depend on the star formation
history (how many generations of supernova-producing stars) and on
whether the newly synthesized metals can be retained. The latter depends
mostly on mass (i.e., gravitational binding energy): low-mass galaxies
lose a lot of their metals in galactic winds.
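To make the dependence on star formation history and metal retention
concrete, here is a minimal sketch in Python of the textbook "closed-box"
chemical evolution model (our illustration, not a method named above), in
which metallicity grows as Z = p ln(1/mu) for yield p and gas fraction mu.
The retention factor crudely mimics metal loss in galactic winds, and all
numbers are illustrative, not fitted values.

    import math

    def closed_box_metallicity(gas_fraction, yield_p=0.01, retention=1.0):
        # Closed-box model: Z = p * ln(1/mu), where mu is the fraction
        # of baryons still in gas.  retention < 1 crudely mimics metal
        # loss in galactic winds (an illustrative assumption).
        return retention * yield_p * math.log(1.0 / gas_fraction)

    # A gas-rich dwarf that loses metals vs. a gas-poor massive spiral:
    for name, mu, keep in [("dwarf", 0.8, 0.3), ("spiral", 0.1, 1.0)]:
        print(name, round(closed_box_metallicity(mu, retention=keep), 4))

Run as-is, the dwarf comes out near Z ~ 0.0007 and the spiral near
Z ~ 0.023 (solar is roughly 0.014), reproducing the qualitative trend
described above.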
Getting the metallicity of a distant galaxy from optical spectroscopy
is much tougher than getting the metallicity of a star. For a star, you
solve for temperature, gravity, and chemical abundances. But a galaxy
has a population of stars with different temperatures, gravities, and
chemical abundances (from old M dwarfs that have been around for
billions of years to O stars that were born yesterday). You need a
pretty good idea of the stellar populations before you can make any
inferences about the metallicity.
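To illustrate why the stellar mix matters, here is a toy version (in
Python with NumPy; the templates and numbers are invented for
illustration) of the linear step inside population-synthesis fitting:
model the galaxy spectrum as a weighted sum of stellar template spectra
and recover the weights before interpreting any line strengths.

    import numpy as np

    rng = np.random.default_rng(0)
    wav = np.linspace(4000.0, 7000.0, 200)        # toy wavelength grid (Angstroms)

    # Three fake templates standing in for hot, solar-type, and cool stars.
    templates = np.vstack([
        np.exp(-(wav - 4200.0) ** 2 / 2e5),       # blue-peaked ("O-star-like")
        np.exp(-(wav - 5500.0) ** 2 / 2e5),       # intermediate
        np.exp(-(wav - 6800.0) ** 2 / 2e5),       # red-peaked ("M-dwarf-like")
    ])

    true_mix = np.array([0.2, 0.5, 0.3])          # the underlying population
    galaxy = true_mix @ templates                 # blended galaxy spectrum
    galaxy += rng.normal(0.0, 0.005, galaxy.size) # observational noise

    # Solve for the population mix by linear least squares.
    fit, *_ = np.linalg.lstsq(templates.T, galaxy, rcond=None)
    print("true:", true_mix, "fitted:", fit.round(3))

Real codes fit many templates spanning age and metallicity, which is why
high-quality spectra are needed, as discussed next.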
Another problem: you need high-resolution, high signal-to-noise
spectra. Cosmologically distant galaxies (the ones that may show a
significant difference) are all too faint, or at least they were before
the days of 10-meter-class telescopes.
For a typical stellar population, one supernova is produced for every
100 solar masses of stars formed; this gives a rough idea of the rate in
a galaxy of a given age and mass (in a galaxy like the Milky Way, 1-10
per century or so). The rate of supernovae is probably not constant over
the age of the galaxy, and it is also not uniform across the galaxy:
metallicity increases the closer you get to the galactic center, due to
the higher density of stars there.
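Turning that rule of thumb into numbers: one supernova per 100 solar
masses of stars formed, integrated over a galaxy's lifetime, gives the
average rate directly. A back-of-the-envelope sketch in Python (the
Milky Way mass and age below are round illustrative values):

    def avg_sn_per_century(stellar_mass_msun, age_years, msun_per_sn=100.0):
        # Total supernovae = stellar mass formed / mass formed per supernova;
        # divide by the number of elapsed centuries for the average rate.
        total_sn = stellar_mass_msun / msun_per_sn
        return total_sn / (age_years / 100.0)

    # ~5e10 solar masses of stars formed over ~1e10 years (round numbers):
    print(avg_sn_per_century(5e10, 1e10))   # -> 5.0 supernovae per century

which lands comfortably inside the 1-10 per century range quoted above.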
Cheers,
Hans Krimm, Bram Boroson, Eric Christian, Kazunori Ishibashi, Mike
Loewenstein, and Koji Mukai for "Ask an Astrophysicist"