Theoretical Astrophysical Observatory (TAO)
The Theoretical Astrophysical Observatory (TAO) houses queryable data from multiple popular cosmological simulations and galaxy formation models. Results can be funnelled through higher-level modules to build custom mock galaxy catalogues and images. TAO is accessible from anywhere with an internet connection.
TAO is part of the All-Sky Virtual Observatory (ASVO) and is funded and supported by Astronomy Australia Limited, the National eResearch Collaboration Tools and Resources (NeCTAR), and Swinburne University of Technology.
For more details about TAO, refer to http://tao.asvo.org.au
GPU-Enabled, High-Resolution, cosmological MicroLensing parameter survey (GERLUMPH)
Present and future synoptic all-sky surveys are set to increase the number of known gravitationally lensed quasars from ∼100 to a few thousand over the next decade. This will improve our understanding of quasar accretion discs and supermassive black holes through the effect of gravitational microlensing. GERLUMPH aims to improve our theoretical understanding of, and tools for, microlensing by producing thousands of high-quality microlensing magnification maps. GERLUMPH uses the GPU-Supercomputer for Theoretical Astrophysics Research (gSTAR) to generate terabytes of map data.
For more details about GERLUMPH, refer to http://gerlumph.swin.edu.au/
Tera-scale Astronomical Data Analysis and Visualization (GraphTIVA)
Within the high-performance computing field, the term “big data” has been used to describe datasets too large to be handled with on-hand analysis, processing, and visualization tools. It is anticipated that the ability to perform these fundamental tasks will become a key basis for competition and science discoveries within the near future. Recent advances in astronomical observing and simulation facilities are expected to move astronomy toward a new data-intensive era where such “big data” is the norm rather than an exception.
To enable knowledge discovery in this new era, we designed and built the Tera-scale interactive visualization and data analysis framework (GraphTIVA). GraphTIVA is a high-performance, graphics processing unit (GPU)-based framework for the efficient analysis and visualization of Tera-scale 3-dimensional data cubes.
Using a cluster of 96 GPUs, we demonstrate for a 0.5 TB image: (1) volume rendering using an arbitrary transfer function at 7–10 frames per second; (2) computation of basic global image statistics such as the mean intensity and standard deviation in 1.7 s; (3) evaluation of the image histogram in 4 s; and (4) evaluation of the global image median intensity in just 45 s. These results correspond to a raw computational throughput approaching one teravoxel per second, and are 10–100 times faster than the best possible performance with traditional single-node, multi-core CPU implementations.
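The global statistics described above are naturally computed as parallel reductions: each node produces partial sums over its portion of the cube, and the median can be estimated from a merged histogram rather than a full distributed sort. The following Python sketch illustrates these two ideas on a small in-memory cube; the chunking stands in for distribution across GPU nodes, and the histogram-median approach is an assumption about the general technique, not necessarily GraphTIVA's exact algorithm.

```python
import numpy as np

def chunked_stats(cube, n_chunks=8):
    """Global mean/std via per-chunk partial sums -- a map-reduce
    sketch where each chunk stands in for one GPU node's data."""
    chunks = np.array_split(cube.ravel(), n_chunks)
    n = sum(c.size for c in chunks)                       # total voxel count
    s = sum(c.sum(dtype=np.float64) for c in chunks)      # partial sums
    ss = sum((c.astype(np.float64) ** 2).sum() for c in chunks)
    mean = s / n
    std = np.sqrt(ss / n - mean ** 2)
    return mean, std

def histogram_median(cube, bins=4096):
    """Approximate the global median from a merged histogram -- one
    way to avoid a full sort of a tera-scale image (illustrative only)."""
    lo, hi = float(cube.min()), float(cube.max())
    hist, edges = np.histogram(cube, bins=bins, range=(lo, hi))
    cdf = np.cumsum(hist)
    k = np.searchsorted(cdf, cube.size / 2)               # first bin crossing 50%
    return 0.5 * (edges[k] + edges[k + 1])                # bin midpoint

rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=(64, 64, 64)).astype(np.float32)
mean, std = chunked_stats(data)
med = histogram_median(data)
```

The histogram median is accurate to within one bin width; tightening the range around the median bin and re-histogramming refines it to arbitrary precision without ever sorting the full cube.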
A scalability analysis shows the framework will scale well to images sized 1 TB and beyond. Other parallel data analysis algorithms can be added to the framework with relative ease, and accordingly, we present our framework as a possible solution to the image analysis and visualization requirements of next-generation telescopes, including the forthcoming Square Kilometre Array pathfinder radiotelescopes.
For more details about GraphTIVA, refer to Amr Hassan’s Thesis
Swinburne Pulsar Portal: real-time supercomputing processing of “big data”
The successful management and utilisation of “big data” is not just an issue for those concerned with data mining the substantial troves of the likes of Google and Amazon. Radio astronomers also mine the sky for rare “needles in astronomical haystacks”, leading to further discoveries of known objects and potentially revealing new, completely unknown astrophysical phenomena.
The Swinburne Pulsar Portal has been created as an extension of the Swinburne University of Technology Metadata Stores Project, funded in part by the Australian National Data Service (ANDS) and motivated by a global research culture in astronomy of downloading and analysing extremely large datasets. The portal will provide an online tool facilitating remote access to and processing of pulsar data from the CSIRO Parkes 64m Radio Telescope, maintained on the gSTAR Supercomputer. Unlike traditional data archives, the Swinburne Pulsar Portal leverages the power of the supercomputer to enable users to conduct sophisticated analyses using advanced computational tools, backed by significant processing power, within a dedicated interface. This setup alleviates the guesswork associated with setting up and maintaining the necessary hardware and software infrastructure.
For more details contact Prof. Matthew Bailes
GBKFIT
GBKFIT is an open-source, high-performance software package for galaxy kinematic modelling. It can be used to extract morphological and kinematical properties of galaxies by fitting models to spatially resolved kinematic data. The software can also take beam smearing into account by using knowledge of the line and point spread functions.
One of GBKFIT’s major strengths is its ability to take advantage of many-core and massively parallel architectures, such as multi-core CPUs and Graphics Processing Units (GPUs). This makes it suitable for modelling large-scale surveys of thousands of galaxies within a very reasonable time frame.
GBKFIT features an extensible object-oriented architecture which allows for support of arbitrary models and optimization techniques in the form of modules. Users can write their own modules without modifying GBKFIT’s source code.
The software is written in C++ and strictly conforms to the latest ISO standards. It supports all the major operating systems (Linux, Windows, Mac OS) and uses the CMake build system to provide a common and unified compilation and installation procedure.
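The general idea of kinematic model fitting can be illustrated with a minimal Python sketch: generate a noisy rotation-curve "observation", smear it with a Gaussian kernel standing in for the line/point spread function, and recover the model parameters with a least-squares fit. The arctan model, parameter names, and SciPy fitter here are illustrative assumptions; GBKFIT's actual models, optimizers, and C++ API differ.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.ndimage import gaussian_filter1d

# Hypothetical arctan rotation-curve model (not GBKFIT's parameterisation):
# velocity rises with radius r and asymptotes to v_max.
def rotation_curve(r, v_max, r_turn):
    return (2.0 / np.pi) * v_max * np.arctan(r / r_turn)

def smeared_model(r, v_max, r_turn, sigma_beam=2.0):
    # Crude stand-in for beam smearing: convolve the model profile
    # with a Gaussian kernel representing the PSF/LSF.
    return gaussian_filter1d(rotation_curve(r, v_max, r_turn), sigma_beam)

rng = np.random.default_rng(0)
radii = np.linspace(0.1, 20.0, 200)                   # toy radial grid (kpc)
truth = smeared_model(radii, 220.0, 3.0)              # v_max=220 km/s, r_turn=3 kpc
observed = truth + rng.normal(0.0, 5.0, radii.size)   # add 5 km/s noise

# Least-squares fit recovers the underlying kinematic parameters.
popt, pcov = curve_fit(smeared_model, radii, observed, p0=[150.0, 1.0])
v_max_fit, r_turn_fit = popt
```

Fitting the smeared model (rather than the raw rotation curve) is what lets the fit disentangle intrinsic kinematics from instrumental blurring, which is the role beam-smearing correction plays in the real software.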
For more details about GBKFIT, refer to http://supercomputing.swin.edu.au/gbkfit
KERLUMPH
KERLUMPH is a lossy JPEG2000 data compression software package for magnification maps. Recent work evaluating the JPEG2000 (ISO/IEC 15444) standards as a future data format standard in astronomy has shown promising results on observational data. KERLUMPH was developed to evaluate the standard’s suitability for data generated in numerical simulations, using GERLUMPH data as a case study. It enables the conversion of magnification maps (pixelated versions of the caustic pattern in the background source plane created by the foreground microlenses) into compressed JPEG2000 files, and the inversion of the process in order to enable work with raw-like data. Our case study results suggest that JPEG2000 could also be suitable for other numerical datasets; well-suited candidates are likely to be data structures such as gridded or volumetric data. In our case study, lossy compression at a high compression ratio does not significantly compromise the intended use of the data for constraining quasar source profiles from cosmological microlensing.
For more details about KERLUMPH, refer to http://supercomputing.swin.edu.au/projects/kerlumph/