
Research

 

Commissioned Reports

  • Manos, Steven, Spear, Kieran, Martinez, Paula A., Quenette, Steve, Gray, Mark, White, Andrew, & ARCOS Working Group. (2020). A National View of Containers and Kubernetes in Research: Shared challenges and opportunities. http://doi.org/10.5281/zenodo.4083297

    • This report was developed through the ARDC Community of Practice - Australian Research Container Orchestration Service (ARCOS), in response to the roadmap listed below. It summarises the findings and challenges expressed through engagements and a survey conducted with a diverse range of Australian stakeholders, whose applications span HPC facilities, cloud-native workflows and emerging edge workflows. It provides recommendations on how to lift the sector by coordinating a co-operated (jointly operated) Kubernetes capability.

  • Soo, Ai-Lin, Janke, Andrew, Betbeder-Matibet, Luc, Francis, Rhys, Giugni, Stephen, & Quenette, Steve. (2020, June 1). Research Data Culture Conversation - Paper 1 "A Summary of the Challenge". Zenodo. http://doi.org/10.5281/zenodo.3887434

    • Research data is a cross-organisational, multi-responsibility activity supported by many functions within a university: research itself, together with support ‘pillars’ such as archives, data privacy, eResearch, IT, the library, records and the research office. The “Research Data Culture Conversation” (RDCC) is a conversation between representatives of these pillars from Australian universities, focused on understanding the challenges faced in the current data-generation climate. This report summarises the results of these discussions over a two-year consultation.

  • Soo, Ai-Lin, Janke, Andrew, Betbeder-Matibet, Luc, Francis, Rhys, Giugni, Stephen, & Quenette, Steve. (2020, June 1). Research Data Culture Conversation - Paper 2 "A Development Response". Zenodo. http://doi.org/10.5281/zenodo.3887399

    • The “Research Data Culture Conversation” (RDCC) is a conversation across the spectrum of pillars within Australian universities, focused on understanding the challenges faced in the current data-generation climate. This report consolidates the findings and provides recommendations, such as: research data management plans must inform decision making and must deliver productivity gains; and institutions, which are a primary location where ambitions and obligations meet budgets, need a nationally coherent, discipline-sensitive response to be defined.

  • Quenette, Steve, Coddington, Paul, Gray, Mark, Manos, Steven, & Fraser, Ryan. (2020). ARCOS establishment roadmap (Version 1.2). http://doi.org/10.5281/zenodo.3990834

    • A roadmap that seeks to emulate, by establishing a Kubernetes Core Services capability, the transformational impact on organisations and researchers delivered by almost 10 years of the ARDC (national OpenStack) Core Services competency. The new capability will take the form of a national team of Kubernetes experts. It will support related open source software for cloud-native computing, engage with the international Kubernetes and Cloud Native Computing communities to ensure Kubernetes is developed to meet Australia's needs, establish innovation and operational best practices amongst the national stakeholders, and co-operate on developing part of the national technical eResearch ecosystem.

  • Quenette, Steve. (2019, December 5). Global experiences digital research infrastructure federations. Zenodo. http://doi.org/10.5281/zenodo.3563246

    • The national OpenStack federation (the ARDC Nectar / Australian Research Cloud + Core Services) has been an outstanding success. It is both a horizontal federation and a global innovator. The purpose of this work was to amass evidence from a spectrum of organisations, including universities, research organisations, and communities, to determine: the meaning of digital federations, the role of digital federations, and the relationship between federation, capacity, governance and impact.

Selected invited presentations

  • Passing on Reinventing the Wheel: Developing methods and reducing barriers to inter- and intra-disciplinary research output translation, 2020 ESIP Summer Meeting Plenary – panellist. Jane Wyngaard, Lindsay Barbieri, Jens Klump, Peter Webley, Steve Quenette, Chris Jack, Anthony Arendt. 

  • Global experiences with digital research infrastructure federations, ARDC Storage and Compute Infrastructure Summit, Brisbane 2019

  • Institutional Directions, ARDC Data and Services Summit, Brisbane 2019

  • Climate change, Brain and Imaging Research on OpenStack (Keynote), OpenStack Summit, Sydney 2017

  • Ceph as Monash University’s research data engine, Red Hat Summit, Boston 2017

  • Visualisation: from Cloud to CAVE, O. Kaluza, S. Quenette, International Conference on Seismic Imaging, Inversion & Visualization, Haikou, Hainan Island, China, January 2017

  • Cloud Infrastructure to Help Researchers Build 21st Century Microscopes, OpenStack Summit, Barcelona, 2016

  • An environment for 21st century microscopes (Keynote). Dell Edu CIO seminar, Chengdu, China, 2016

  • Scale and performance: Servicing the Fabric and the Workshop. CSIRO CSS & eResearch Annual Conference, Melbourne 2016

  • Software, complexity & reuse... Robust-to-experimental codes for geodynamics. Computational Infrastructure in Geodynamics – Strategic planning workshop, Pasadena USA, 2009

  • An approach to software composability, and its importance to enabling multiscale - multiphysics geodynamics, University of Minnesota, USA, 2008

  • A roles-based approach to enabling multi-scale, multi-physics computational geophysics codes, ExxonMobil Corporate Strategic Research, New Jersey, USA, 2006

Awarded grants

  • Computational capability of SAM maintaining Underworld and associated ecosystems. Moresi, Louis N. (PCI), Quenette, Steve (CI), Mansour, John (CI), Capitanio, Fabio (CI), Farrington, Rebecca (CI). 1/07/20 → 30/06/22. $954,472 award

  • AuScope partnership in the Australian Scalable Drones Cloud (ASDC). Quenette, Steve (PCI), Clarke, Rohan (CI), Glasgow, Robert (CI), Kaluza, Owen (AI). 1/07/20 → 30/06/22. $123,334 award

  • Establishing Australia’s Scalable Drone Cloud (ASDC). Quenette, Steve (PCI), Clarke, Rohan (CI), Rawling, Timothy (CI), Guru, Siddeswara (CI), Klump, Jens (CI), and Brown, Tim (CI). ARDC Platform. 3/02/20 → 3/02/23. $855,000 award + $855,000 cash co-investment

  • 2nd Generation of the ARDC Nectar Research Cloud at Monash. Quenette, Steve (PCI), Aung, Swe Win (CI), Soo, Ai-Lin (CI), Goscinski, Wojtek (CI), Bonnington, Paul (CI), and Revote, Jerico (CI). 1/01/20 → 31/12/22. $1,022,000 award + $678,000 cash co-contribution

  • Role of Institutions in a National Data Commons. Quenette, Steve (PCI), Groenewegen, David (CI), Ennor, Sandra (CI), Dart, Stephen (CI), and Kannan, Anitha (CI). 1/09/19 → 31/01/20. $50,000 award.

  • Global experiences digital research infrastructure federations. Quenette, Steve (PCI), Goscinski, Wojtek (CI), Bonnington, Paul (CI). 1/07/19 → 31/10/19. $50,000 award.

  • Research Cloud Core Services - WP01A. Manos, Steve (PCI), Botten, Lindsay Charles (CI), and Quenette, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/21. $6,258,378 award

  • Research Cloud National Neutron upgrade - WP01B. Quenette, Steve (PCI), Bethwaite, Blair (CI), Aung, Swe Win (CI), Revote, Jerico (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $240,000 award

  • Research Cloud National Memory expansion - WP01C. Quenette, Steve (PCI), Bethwaite, Blair (CI), Aung, Swe Win (CI), Revote, Jerico (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $512,507 award

  • Research Cloud Monitoring & Reporting - WP02. Botten, Lindsay Charles (CI), Bindoff, Nathaniel Lee (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), Quenette, Steve (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $376,000 award

  • Research Cloud Security Monitoring & Incident Response - WP03. Gibson, Ian (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Hobson, Mary (CI), Quenette, Steve (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $228,000 award

  • Research Cloud Quality Assurance of Nectar VMs & Reference Stacks - WP04. Quenette, Steve (CI), and Bethwaite, Blair (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $98,000 award

  • Research Cloud Continuous Improvement - WP05. Gibson, Ian (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Hobson, Mary (CI), Quenette, Steve (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $114,000 award

  • Research Cloud User Support & Distributed Help Desk - WP06. Cook, Rob (PCI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Gibson, Ian (CI), Hobson, Mary (CI), Manos, Steve (CI), Quenette, Steve (CI), and Stringfellow, Neil (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/21. $2,791,422 award

  • Research Cloud Application Orchestration - WP07A. Manos, Steve (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Quenette, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $136,000 award

  • Research Cloud Trove - Database as a Service - WP07B. Quenette, Steve (PCI), Bethwaite, Blair (CI), Revote, Jerico (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $136,650 award

  • Research Cloud Storage Services - WP07C. Botten, Lindsay Charles (CI), Bindoff, Nathaniel Lee (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), Quenette, Steve (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $157,250 award

  • Research Cloud Elastic Data Service - WP07.5. Botten, Lindsay Charles (CI), Bindoff, Nathaniel Lee (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), Quenette, Steve (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $86,000 award

  • Research Cloud Resource Allocation process & Management System - WP08 (CRAMS). Quenette, Steve (PCI), Barney, Sebastian (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Manos, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $192,000 award

  • Research Cloud National Server Program cloudification - WP11. Manos, Steve (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), and Quenette, Steve (CI). NeCTAR Research Cloud Surge of Investment. 1/07/14 → 30/06/15. $135,000 award

  • NeCTAR Research Cloud at Monash. Quenette, Steve (PCI), Bonnington, Paul (CI), Bethwaite, Blair (CI), Aung, Swe Win (CI), Revote, Jerico (CI), Bindoff, Nathaniel Lee (CI), Botten, Lindsay Charles (CI), Cook, Rob (CI), Gibson, Ian (CI), Hobson, Mary (CI), and Manos, Steve (CI). NeCTAR Research Cloud. 6/06/12 → 30/06/21. $7,993,995 award + $2,144,475 cash co-investment

Plus numerous others where I am not the Primary Chief Investigator, or that are yet to be entered here.

Other workshops, reports and white papers

  • Francis, Rhys; Soo, Ai-Lin; Quenette, Steve (2020): Research Data Culture Conversation 2019 National Meeting Presentation. Monash University. Presentation. https://doi.org/10.26180/13088987 

  • Quenette, Steve; Rawling, Timothy; Musson, Alex; Sandiford, Mike (2010): AuScope Geothermal Demonstrators - Latrobe Valley, Victoria, Australia. Monash University. Report. https://doi.org/10.26180/5f4dc2c4e94e0

  • Quenette, Steve; O'Neill, Craig (2010): AuScope Geothermal Demonstrators - Gunnedah, New South Wales, Australia. Monash University. Report. https://doi.org/10.26180/5f4dc6d0a47b3

  • Quenette, Steve; Kirkby, Alison (2010): AuScope Geothermal Demonstrators - Cooper Basin, South Australia / Queensland, Australia. Monash University. Report. https://doi.org/10.26180/5f4dd65272327

Plus numerous others yet to be entered here.

Published open source software

  • Amarapathy, Samitha; Mohamed Feroze, Rafi; Luong, Melvin; Yu, Simon; Quenette, Steve; Dart, Stephen; et al. (2021): Cloud Resource Allocation Management System (CRAMS) - Data Dashboard release. Monash University. Software. https://doi.org/10.26180/14379101 / https://github.com/CRAMS-Dashboard/crams  

    • The Cloud Resource Allocation Management System (CRAMS) provides self-service for resource requests, approvals, instantiation and utilisation across data storage, High Performance Computing (HPC) platforms and the Research Cloud. CRAMS gives researchers, research facilities and universities an effective mechanism to manage and monitor usage. It provides facilities and universities with aggregate demand and usage information, and integrates into many workflows involving the research data lifecycle and compliance (an illustrative sketch of this allocation lifecycle appears after this list).

  • John Mansour, Julian Giordani, Louis Moresi, Romain Beucher, Owen Kaluza, Mirko Velic, Rebecca Farrington, Steve Quenette, Adam Beall. (2020, February 10). underworld2 (Version v2.9.0b). Zenodo. http://doi.org/10.5281/zenodo.3661465 / https://github.com/underworldcode/underworld2  

    • Underworld 2 is a Python API (Application Programming Interface) for modelling geodynamics processes, designed to work (almost) seamlessly across PC, cloud and HPC infrastructure. It enables modelling of 2- and 3-dimensional geodynamics processes, utilising a particle-in-cell finite element approach to solve Stokes-flow-type configurations. In Underworld, the finite element mesh can be static or dynamic, but it is not constrained to move in lock-step with the evolving geometry of the fluid. This hybrid approach allows Underworld to obtain accurate velocity solutions (on the mesh) for a given material configuration, while simultaneously ensuring the accurate advection of material interfaces and history information (using particle swarms); a schematic sketch of this mesh/particle split appears after this list.

  • Moresi, Louis; Quenette, Steve; Lemiale, Vincent; Mériaux, Catherine; Appelbe, Bill; Muhlhaus, Hans-B; Giordani, Julian; Velic, Mirko; May, David; Farrington, Rebecca; Sharples, Wendy; Freeman, Justin; Mansour, John; Sunter, Patrick; Turnbull, Rob; Hodkinson, Luke (2018, October 4). underworldcode/underworld1: Final version of UW1 (Version v2016-final). Zenodo. http://doi.org/10.5281/zenodo.1445812 / https://github.com/underworldcode/underworld1

    • Underworld1 is a parallel, particle-in-cell finite element code for large-scale geodynamics simulations. It enables modelling of 2- and 3-dimensional geodynamics processes, utilising a particle-in-cell finite element approach to solve Stokes-flow-type configurations. In Underworld, the finite element mesh can be static or dynamic, but it is not constrained to move in lock-step with the evolving geometry of the fluid. This hybrid approach allows Underworld to obtain accurate velocity solutions (on the mesh) for a given material configuration, while simultaneously ensuring the accurate advection of material interfaces and history information (using particle swarms).

  • Owen Kaluza, Louis Moresi, John Mansour, David G Barnes, & Steve Quenette. (2020, July 21). lavavu/LavaVu: v1.6 (Version 1.6). Zenodo. http://doi.org/10.5281/zenodo.3953434 / https://github.com/lavavu/LavaVu

    • LavaVu is a scientific visualisation library with a Python interface, built for interactive visual analysis and collaborative work within Python notebook environments while utilising local or remote hardware. The acronym stands for Lightweight, Automatable Visualisation and Analysis Viewing Utility, but "lava" is also a reference to its original application as a viewer for geophysical simulations. The emphasis is on 4D datasets. Rendering is done in OpenGL and C++ with a Python interface wrapper. Interactive visualisations in IPython are supported via a threaded web interface, which allows remote GPU resources on the hardware where the data is stored to be leveraged while sending only image frames back to the client. Jupyter, JupyterLab, Nteract and Google Colab environments are all supported. Output can be completely scripted in Python, and animations and video can be produced from these scripts. WebGL output can also be generated to produce client-side, single-file .html 3D visualisations, and WebVR support allows use with virtual reality devices. A minimal notebook usage sketch appears after this list.

  • Steve Quenette. (2020, February 17). quenette/COMPASS-C: Phd Final (Version Phd). Zenodo. http://doi.org/10.5281/zenodo.3669377 / https://github.com/quenette/COMPASS-C

  • Steve Quenette. (2020, February 17). quenette/COMPASS-I: Phd Final (Version Phd). Zenodo. http://doi.org/10.5281/zenodo.3669379 / https://github.com/quenette/COMPASS-I
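
As referenced in the CRAMS entry above, the following is a minimal, hypothetical Python sketch of the allocation lifecycle a CRAMS-style system manages (request → approval → provisioning → usage reporting). The class, field and method names are illustrative assumptions only; they are not CRAMS's actual data model or API.

      # Hypothetical sketch of a resource-allocation request lifecycle (not the CRAMS schema).
      from dataclasses import dataclass
      from enum import Enum, auto

      class Status(Enum):
          SUBMITTED = auto()    # researcher self-service request
          APPROVED = auto()     # facility / university approval
          PROVISIONED = auto()  # storage, HPC or Research Cloud allocation instantiated
          REPORTING = auto()    # utilisation tracked against the allocation

      @dataclass
      class AllocationRequest:
          project: str
          service: str          # e.g. "storage", "hpc", "research-cloud"
          quota: float          # requested amount, in service-specific units
          status: Status = Status.SUBMITTED
          usage: float = 0.0    # utilisation recorded against the approved quota

          def approve(self) -> None:
              self.status = Status.APPROVED

          def provision(self) -> None:
              self.status = Status.PROVISIONED

          def record_usage(self, amount: float) -> None:
              self.usage += amount
              self.status = Status.REPORTING

      # Aggregate demand and usage across requests, as a facility-level report might.
      def aggregate(requests: list[AllocationRequest]) -> dict[str, float]:
          totals: dict[str, float] = {}
          for r in requests:
              totals[r.service] = totals.get(r.service, 0.0) + r.usage
          return totals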
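
As noted in the Underworld entries above, the particle-in-cell approach separates the velocity solution (computed on a mesh) from material and history information (carried on particles). The NumPy sketch below is a schematic, one-dimensional illustration of that split using assumed toy data; it does not use Underworld's actual API and does not solve a real Stokes problem.

      # Schematic 1D particle-in-cell advection (illustrative only; not Underworld's API).
      import numpy as np

      nodes = np.linspace(0.0, 1.0, 33)         # fixed finite-element-style mesh nodes
      velocity = np.sin(np.pi * nodes)          # stand-in for a velocity field solved on the mesh

      rng = np.random.default_rng(0)
      particles = rng.uniform(0.0, 1.0, 2000)   # particle swarm positions
      material = (particles < 0.5).astype(int)  # material/history lives on particles, not the mesh

      dt = 0.01
      for _ in range(100):
          # Interpolate the mesh velocity to particle positions, then advect the particles.
          v_at_particles = np.interp(particles, nodes, velocity)
          particles = np.clip(particles + dt * v_at_particles, 0.0, 1.0)

      # The mesh stays fixed (or can move independently), while material interfaces and
      # history information are advected on the particle swarm.
      print("material-1 centre of mass:", particles[material == 1].mean())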
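
As referenced in the LavaVu entry above, the sketch below shows a minimal notebook-style use of the library, assuming the lavavu package is installed in a Jupyter (or similar) environment. The calls used here (Viewer, points, vertices, values, colourmap, display) follow the project's documented examples, but should be verified against the repository linked above.

      # Minimal LavaVu notebook sketch (verify call names against the LavaVu documentation).
      import numpy as np
      import lavavu

      lv = lavavu.Viewer(border=False, background="white", resolution=[640, 480])

      # A small synthetic point cloud, coloured by distance from the origin.
      rng = np.random.default_rng(0)
      verts = rng.uniform(-1.0, 1.0, size=(1000, 3))
      vals = np.linalg.norm(verts, axis=1)

      points = lv.points(pointsize=5)
      points.vertices(verts)
      points.values(vals)
      points.colourmap("coolwarm")

      # In Jupyter/JupyterLab/Colab the rendering runs server-side (on local or remote GPU
      # hardware) and only image frames are streamed back to the notebook client.
      lv.display()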

Papers

Mostly yet to be entered here, but readily found elsewhere.

  • Mansour, J., Giordani, J., Moresi, L., Beucher, R., Kaluza, O., Velic, M., Farrington, R., Quenette, S., Beall, A., 2020, Underworld2: Python Geodynamics Modelling for Desktop, HPC and Cloud, Journal of Open Source Software, 5(47), 1797, https://doi.org/10.21105/joss.01797

 

Banner image: Quenette, Steve; Kaluza, Owen; Moresi, Louis (2021): Latrobe Valley images from the AuScope Geothermal Demonstrators. Monash University. Figure. https://doi.org/10.26180/15001218 (CC BY 4.0)