ASTEROSEISMOLOGY METACOMPUTER ($25,000)

R. E. Nather and Travis Metcalfe

We study white dwarf stars because they are the end points of stellar evolution for the majority of all stars, and their composition and structure can tell us about their prior history. We can determine the internal structure of variable white dwarfs by observing their variations in brightness as a function of time, using the techniques of high-speed photometry to define their light curves, and then matching these observations with a computer model that behaves the same way. The parameters of the models are chosen to correspond one-to-one with the physical processes that give rise to the variations, so a good fit to the data gives us confidence that our model reflects the actual physics of the stars themselves.
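As a schematic illustration of the matching step, the sketch below (in C) compares an observed light curve with a model light curve through a chi-squared statistic. The data values, array layout, and function name are assumptions made for this example, not taken from our actual codes.

    /* Illustrative sketch only: a chi-squared comparison between an
     * observed light curve and a model light curve.  The arrays and
     * names here are assumptions for the example, not our real codes. */
    #include <stdio.h>

    /* Goodness of fit: sum of squared, uncertainty-weighted residuals. */
    double chi_squared(const double *observed, const double *model,
                       const double *sigma, int npts)
    {
        double chi2 = 0.0;
        for (int i = 0; i < npts; i++) {
            double r = (observed[i] - model[i]) / sigma[i];
            chi2 += r * r;
        }
        return chi2;
    }

    int main(void)
    {
        /* Hypothetical 5-point light curve (relative intensity). */
        double obs[]   = {1.00, 1.02, 0.97, 1.01, 0.99};
        double mod[]   = {1.00, 1.01, 0.98, 1.00, 1.00};
        double sigma[] = {0.01, 0.01, 0.01, 0.01, 0.01};

        printf("chi^2 = %g\n", chi_squared(obs, mod, sigma, 5));
        return 0;
    }

A lower chi-squared value indicates a model light curve that more closely reproduces the observations, which is the sense in which we speak of a "good fit" below.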

Although this procedure is simple in outline, its realization in practice requires specialized instrumentation to overcome the practical difficulties we encounter in the process. We have developed the Whole Earth Telescope (WET) observing network to provide the 200 or more hours of essentially gap-free data we require for the analysis. This instrument is now mature, and has provided a wealth of seismological data on the different varieties of variable white dwarf stars, so the observational part of the procedure is well in hand. We need to improve our analytical procedures to take full advantage of the possibilities afforded by asteroseismology, and we propose to construct a specialized instrument to do so.

Our computer models of white dwarf stars have, of necessity, many parameters representing the internal physics of the objects, most of which are not independent of the others, and finding a set of them that provides a close fit to the observed data is difficult. The current procedure is a cut-and-try process guided by intuition and experience, and is far more subjective than we would like. More objective procedures are essential if asteroseismology is to become a widely accepted astronomical technique. We must be able to demonstrate that, within the range of different values the model parameters can assume, we have found the only solution, or the best one if more than one is possible. We plan to apply a search-and-fit technique employing genetic algorithms, which can explore the myriad possible parameter combinations and select the best one (or ones) for us. This basic technique has been applied in other disciplines, but has seen little use in astronomy to date. Travis Metcalfe applied the procedure with success to his second-year research project (fitting the parameters of a model to his observations of the variations in a W UMa star), and plans to apply it to the white dwarf observations as part of his doctoral thesis.
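The skeleton below, written in C, sketches the kind of genetic-algorithm search we have in mind: a population of trial parameter sets is evolved by selection, crossover, and mutation toward higher fitness. The population size, mutation rate, and toy fitness function (which simply rewards closeness to a hidden target parameter set) are illustrative assumptions; in the real application each fitness evaluation would run a pulsation model and compare it against the observed light curve.

    /* Minimal genetic-algorithm sketch.  The toy fitness function,
     * population size, and rates are illustrative assumptions; in
     * practice each fitness evaluation would run a white dwarf
     * pulsation model and compare it with the WET observations. */
    #include <stdio.h>
    #include <stdlib.h>

    #define NPARAM   3      /* parameters per trial model          */
    #define NPOP     64     /* trial parameter sets per generation */
    #define NGEN     200    /* generations to evolve               */
    #define PMUT     0.05   /* per-parameter mutation probability  */

    static double rand01(void) { return rand() / (RAND_MAX + 1.0); }

    /* Toy fitness: higher is better as the trial approaches a hidden
     * "true" parameter set (stand-in for matching the observations). */
    static double fitness(const double p[NPARAM])
    {
        static const double target[NPARAM] = {0.6, 0.2, 0.8};
        double sum = 0.0;
        for (int i = 0; i < NPARAM; i++)
            sum += (p[i] - target[i]) * (p[i] - target[i]);
        return 1.0 / (1.0 + sum);
    }

    /* Tournament selection: pick the fitter of two random individuals. */
    static int select_parent(const double fit[NPOP])
    {
        int a = rand() % NPOP, b = rand() % NPOP;
        return (fit[a] > fit[b]) ? a : b;
    }

    int main(void)
    {
        double pop[NPOP][NPARAM], next[NPOP][NPARAM], fit[NPOP];

        /* Random initial population, each parameter scaled to [0,1). */
        for (int i = 0; i < NPOP; i++)
            for (int j = 0; j < NPARAM; j++)
                pop[i][j] = rand01();

        for (int gen = 0; gen < NGEN; gen++) {
            int best = 0;
            for (int i = 0; i < NPOP; i++) {
                fit[i] = fitness(pop[i]);
                if (fit[i] > fit[best]) best = i;
            }

            /* Elitism: carry the best individual forward unchanged. */
            for (int j = 0; j < NPARAM; j++) next[0][j] = pop[best][j];

            /* Breed the rest by crossover and mutation. */
            for (int i = 1; i < NPOP; i++) {
                int ma = select_parent(fit), pa = select_parent(fit);
                int cut = rand() % NPARAM;       /* one-point crossover */
                for (int j = 0; j < NPARAM; j++) {
                    next[i][j] = (j < cut) ? pop[ma][j] : pop[pa][j];
                    if (rand01() < PMUT)         /* random mutation     */
                        next[i][j] = rand01();
                }
            }

            for (int i = 0; i < NPOP; i++)
                for (int j = 0; j < NPARAM; j++)
                    pop[i][j] = next[i][j];

            if (gen % 50 == 0)
                printf("generation %3d: best fitness = %.4f\n", gen, fit[best]);
        }
        return 0;
    }

Carrying the best trial forward unchanged (elitism) is one common way to keep the search from losing its best solution; the operators and rates would be tuned for the white dwarf problem rather than fixed at the values shown here.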

Although extremely effective (and objective) in their application, genetic algorithms require a very large amount of computer time, far more than is readily available with our current computing facilities. We therefore propose to assemble a metacomputer: a collection of minimal PCs connected by Ethernet that can work in parallel to solve our problem in a reasonable time. Systems of this type have been assembled at Los Alamos and at Caltech, so we can scale from their results. We expect our proposed assembly to be capable of 1,000 to 2,000 megaflops (million floating-point operations per second). The software to operate the computer system (Linux, PVM) is free, and is already working on our departmental computers.
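The sketch below shows, in C, the master side of the kind of PVM master/worker arrangement we envision: the master enrolls in the virtual machine, spawns worker tasks across the nodes, sends each one a trial parameter set, and collects a fitness value back from whichever worker finishes first. The worker executable name (ga_worker), the message tags, and the worker count are assumptions made for this illustration.

    /* Sketch of the master side of a PVM master/worker scheme: trial
     * parameter sets are farmed out to worker tasks on the metacomputer
     * nodes, and one fitness value is collected back from each.  The
     * worker name "ga_worker", tags, and counts are assumed here. */
    #include <stdio.h>
    #include "pvm3.h"

    #define NPARAM    3     /* parameters per trial model             */
    #define NWORKERS  8     /* worker tasks to spawn across the nodes */
    #define TAG_WORK  1     /* message tag: master -> worker          */
    #define TAG_DONE  2     /* message tag: worker -> master          */

    int main(void)
    {
        int tids[NWORKERS];
        double trial[NWORKERS][NPARAM];

        int mytid = pvm_mytid();             /* enroll in the virtual machine */
        printf("master enrolled as t%x\n", mytid);

        /* Start the worker tasks on whatever nodes PVM chooses. */
        int started = pvm_spawn("ga_worker", (char **)0, PvmTaskDefault,
                                "", NWORKERS, tids);

        /* Fill in trial parameter sets (placeholder values here). */
        for (int i = 0; i < started; i++)
            for (int j = 0; j < NPARAM; j++)
                trial[i][j] = (double)(i + j) / (NWORKERS + NPARAM);

        /* Send one trial parameter set to each worker. */
        for (int i = 0; i < started; i++) {
            pvm_initsend(PvmDataDefault);
            pvm_pkdouble(trial[i], NPARAM, 1);
            pvm_send(tids[i], TAG_WORK);
        }

        /* Collect a fitness value from each worker, in whatever order
         * the workers finish. */
        for (int i = 0; i < started; i++) {
            double fit;
            pvm_recv(-1, TAG_DONE);          /* any worker, tag TAG_DONE */
            pvm_upkdouble(&fit, 1, 1);
            printf("received fitness %.4f\n", fit);
        }

        pvm_exit();                          /* leave the virtual machine */
        return 0;
    }

Each worker would simply loop: receive a parameter set, run the pulsation code on its local processor, and send the resulting fitness back to the master, which then assigns it the next trial.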

This computing facility, once assembled and working, will be made available to other members of the department who have problems that can be easily segmented and solved by a system of cooperating, parallel computers.