Components of the Milky Way and GAIA
ArXiv astro-ph/0109118 (2001)
Abstract:
The GAIA mission will produce an extraordinary database from which we should be able to deduce not only the Galaxy's current structure, but also much of its history, and thus cast a powerful light on the way in which galaxies in general are built up from components, and on how these formed. The database can be fully exploited only by fitting to it a sophisticated model of the entire Galaxy. Steady-state models are of fundamental importance even though the Galaxy cannot be in a steady state. A very elaborate model of the Galaxy will be required to reproduce the great wealth of detail that GAIA will reveal. If such a model is to be constructed successfully, however, a systematic approach to model-building will be required. The natural strategy is to proceed through a series of models of ever-increasing elaborateness, and to be guided in the specification of the next model by mismatches between the data and the current model. An approach to the dynamics of systems with steady gravitational potentials that we call the 'torus programme' promises to provide an appropriate framework within which to carry out the proposed modelling programme. The basic principles of this approach have been worked out in some detail and are summarized here. Some extensions will be required before the GAIA database can be successfully confronted. Other modelling techniques that might be employed are briefly examined.
Multi-level adaptive particle mesh (MLAPM): a C code for cosmological simulations
Monthly Notices of the Royal Astronomical Society 325:2 (2001) 845-864
Abstract:
We present a computer code written in C that is designed to simulate structure formation from collisionless matter. The code is purely grid-based and uses a recursively refined Cartesian grid to solve Poisson's equation for the potential, rather than obtaining the potential from a Green's function. Refinements can have arbitrary shapes and in practice closely follow the complex morphology of the density field that evolves. The time-step shortens by a factor of 2 with each successive refinement. Competing approaches to N-body simulation are discussed from the point of view of the basic theory of N-body simulation. It is argued that an appropriate choice of softening length ε is of great importance and that ε should be at all points an appropriate multiple of the local interparticle separation. Unlike tree and P3M codes, multigrid codes automatically satisfy this requirement. We show that at early times and low densities in cosmological simulations, ε needs to be significantly smaller relative to the interparticle separation than in virialized regions. Tests of the ability of the code's Poisson solver to recover the gravitational fields of both virialized haloes and Zel'dovich waves are presented, as are tests of the code's ability to reproduce analytic solutions for plane-wave evolution. The times required to conduct a ΛCDM cosmological simulation for various configurations are compared with the times required to complete the same simulation with the ART, AP3M and GADGET codes. The power spectra, halo mass functions and halo-halo correlation functions of simulations conducted with different codes are compared. The code is available from http://www-thphys.physics.ox.ac.uk/users/MLAPM.
Kinematics from spectroscopy with a wide slit: Detecting black holes in galaxy centres
Monthly Notices of the Royal Astronomical Society 323:4 (2001) 831-838
Abstract:
We consider long-slit emission-line spectra of galactic nuclei when the slit is wider than the instrumental point spread function, and the target has large velocity gradients. The finite width of the slit generates complex distributions of brightness at a given spatial point in the measured spectrum, which can be misinterpreted as coming from additional physically distinct nuclear components. We illustrate this phenomenon for the case of a thin disc in circular motion around a nuclear black hole (BH). We develop a new method for estimating the mass of the BH that exploits a feature in the spectrum at the outer edge of the BH's sphere of influence, and therefore gives higher sensitivity to BH detection than traditional methods. Moreover, with this method we can determine the BH mass and the inclination of the surrounding disc separately, whereas the traditional approach to BH estimation requires two long-slit spectra to be taken. We show that, with a given spectrograph, the detectability of a BH depends on the sense of rotation of the nuclear disc. We apply our method to estimate the BH mass in M84 from a publicly available spectrum, and recover a value four times lower than that published previously from the same data.
Two-Body Relaxation in Cosmological Simulations
ArXiv astro-ph/0105183 (2001)
Abstract:
It is logically possible that early two-body relaxation in simulations of cosmological clustering influences the final structure of massive clusters. Convergence studies, in which mass and spatial resolution are simultaneously increased, cannot eliminate this possibility. We test the importance of two-body relaxation in cosmological simulations with simulations in which there are two species of particles. The cases of two mass ratios, √2:1 and 4:1, are investigated. Simulations are run with both a spatially fixed softening length and adaptive softening, using the publicly available codes GADGET and MLAPM, respectively. The effects of two-body relaxation are detected in both the density profiles of halos and the mass function of halos. The effects are more pronounced with a fixed softening length, but even in this case they are not so large as to suggest that results obtained with one mass species are significantly affected by two-body relaxation. The simulations that use adaptive softening are less affected by two-body relaxation and produce slightly higher central densities in the largest halos. They run about three times faster than the simulations that use a fixed softening length.
MLAPM - a C code for cosmological simulations
ArXiv astro-ph/0103503 (2001)
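
The softening-length criterion discussed in the MLAPM and two-body-relaxation abstracts above can be sketched numerically. The following is a minimal illustration, not code from any of the papers: `eta` (the multiple of the local interparticle separation) and the standard order-of-magnitude relaxation-time formula t_relax ~ (0.1 N / ln N) t_cross are illustrative assumptions used to show the scalings, not values or expressions taken from the publications listed here.

```python
import math

def local_softening(number_density, eta=0.5):
    """Adaptive softening as a multiple eta of the local interparticle
    separation n**(-1/3).  eta = 0.5 is an illustrative choice only;
    the abstracts above argue eta should itself be smaller at early
    times / low densities than in virialized regions."""
    return eta * number_density ** (-1.0 / 3.0)

def relaxation_time(n_particles, crossing_time):
    """Textbook order-of-magnitude two-body relaxation time,
    t_relax ~ (0.1 N / ln N) * t_cross."""
    return 0.1 * n_particles / math.log(n_particles) * crossing_time

# A region 100x denser than the cosmic mean calls for a softening
# length smaller by a factor 100**(1/3), i.e. about 4.6:
eps_mean = local_softening(1.0)
eps_halo = local_softening(100.0)
print(eps_mean / eps_halo)  # ~4.64
```

The point of the sketch is the scaling: a fixed softening length cannot track the factor-of-several variation in interparticle separation between voids and virialized halos, which is why the multigrid approach, where the refinement level sets the effective softening, satisfies the criterion automatically.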