PhD Student Projects
ExaGEO equips students with the skills, knowledge, and principles of exascale computing — drawing from geoscience, computer science, statistics, and computational engineering — to tackle some of the most pressing challenges in Earth and environmental sciences and computational research. Students will work under expert supervision in the following fields:
- Atmosphere, hydrosphere, cryosphere, and ecosystem processes and evolution
- Geodynamics, geoscience and environmental change
- Geologic hazard analysis, prediction and digital twinning
- Sustainability solutions in engineering, environmental, and social sciences
Each student will work within a multidisciplinary supervisory team: one computational supervisor, one domain expert, and one supervisor from an Earth, environmental, or social science research background. This ‘team-based’ supervisory approach is designed to enhance multidisciplinary training.
Please note that some of the projects listed below currently have incomplete supervisory teams; however, the full teams will be finalised before the start of the PhD.
Project Selection and Information
- You must apply for three projects. Each project comprises two variations, known as teaser projects. During your first year, after working on both teaser projects (under the same supervisory team), you will select the one that best aligns with your interests. For further information on how this process works, please see the FAQs section on our Apply page.
- Your PhD institution will be determined by the Principal Supervisor’s institutional affiliation.
- You can apply for projects at different institutions.
- Projects are grouped by research field.
- Projects are funded via ExaGEO; this includes fees, stipends and a Research Training Support Grant. For further information, please see our Apply page.
Below is a list of our currently available projects. We encourage applications from students from diverse cultural and disciplinary backgrounds who are eager to take ownership of their research journey while benefiting from ExaGEO’s cutting-edge resources and interdisciplinary expertise. ExaGEO is committed to advancing Equality, Diversity and Inclusion (EDI) in our training.
Projects with a focus on Atmosphere, Hydrosphere, Cryosphere, and Ecosystem Processes and Evolution:
Advancing prediction at the soil-water interface through data assimilation and exascale computing
Project institution: Lancaster University
Project supervisor(s): Prof Jess Davies (Lancaster University), Prof Lindsay Beevers (University of Edinburgh), Dr Simon Moulds (University of Edinburgh) and Prof Gordon Blair (Lancaster University)
Overview and Background
Soil-water interactions are fundamental to a number of environmental processes and play a pivotal role in flood management, plant growth, and nutrient cycling. However, these interactions are highly complex, and we currently rely on computationally intensive process-based models to understand them and predict their influence on ecosystem services. With recent advances in satellite imagery and sensing, soil moisture data and other relevant data products are now available at spatial and temporal scales suitable for enhancing these models. However, integrating large volumes of data with these complex models is challenging. This studentship focuses on taking advantage of new exascale computing approaches to facilitate data assimilation, exploring how the fusion of big data with soil-water process models can help unlock new insights and understanding.
Teaser Project 1: Improve process-based model representation of the long-term effects of extreme weather on soil carbon and nutrient cycling through remote sensing data assimilation
The lack or over-abundance of water can have large effects on plants, especially on annual crops, where water conditions can severely affect the plant’s growth and survival. With changing water patterns and increasing frequency of heat waves and extreme rainfall events, the effects on plant productivity are expected to be large, and there will be knock-on effects for soil carbon storage and nutrient cycling in the longer term. Remote sensing offers many data products that can provide us with data-based insights into plant productivity and soil moisture conditions. However, remote sensing of soil carbon is much more difficult, and understanding of the long-term response to changes in plant productivity still requires process-based models. In this project we will experiment with combining remote sensing data and process-based models to better understand the long-term effects of extreme weather on soil carbon and nutrient cycling.
Methods and PhD Pathway:
- The process-based model N14CP, which simulates plant-soil carbon, nitrogen and phosphorus cycling and the export of dissolved nutrients to waterways, will be adapted to assimilate Gross or Net Primary Productivity (GPP/NPP) and soil moisture remote sensing data products.
- The student will explore a range of approaches, from direct insertion to machine learning methods. The first teaser will begin with NPP, as this has a direct proxy in the model. Freely available datasets, for example from MODIS and SMAP, that match the spatial resolution of the model will provide a starting point.
- To develop this path into a full PhD: multiple methods for assimilation will be explored; two-way learning between data and models will be considered; and methods for making data assimilation real-time will be explored, with exascale/GPU computing helping to move towards a digital twin.
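At the simplest end of that spectrum, direct insertion overwrites a model state variable with the satellite retrieval wherever an observation exists, and keeps the model trajectory elsewhere. A minimal sketch (function and variable names are illustrative, not part of N14CP):

```python
def direct_insertion(model_states, observations):
    """Direct-insertion assimilation: wherever a remote-sensing
    observation exists (not None), it overwrites the model state;
    elsewhere the model trajectory is kept unchanged."""
    assimilated = []
    for state, obs in zip(model_states, observations):
        assimilated.append(obs if obs is not None else state)
    return assimilated

# Hypothetical weekly soil-moisture trajectory (m3/m3) and sparse satellite retrievals
model_sm = [0.30, 0.28, 0.25, 0.24, 0.27]
satellite_sm = [None, 0.31, None, 0.22, None]
print(direct_insertion(model_sm, satellite_sm))  # [0.3, 0.31, 0.25, 0.22, 0.27]
```

More sophisticated approaches weight the model and the observation by their respective uncertainties rather than trusting the observation completely.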
Teaser Project 2: Estimate the contribution of soils to mitigating or increasing flood risk in a case study catchment by combining remote sensed soil moisture data and hydrological models
Antecedent soil saturation conditions can play a significant role in mitigating or increasing flood risk. If a soil already holds significant water in storage, its capacity to store additional water during periods of high rainfall is reduced.
Soil moisture is an important component of semi-distributed and distributed hydrological models; however, it is not routinely updated dynamically during a simulation. With newly available satellite observations, soil moisture estimates now have sufficient temporal and spatial resolution that they could be used to improve flood routing and water balance within catchment hydrological models.
Combining remote sensing and soil water probes offers an opportunity to develop real-time estimation of soil water across catchments. The student will explore different approaches to data assimilation and upscaling to the catchment scale.
Methods and PhD pathway:
- Combining soil moisture estimates into hydrological, and eventually hydraulic, modelling for flood inundation estimation will entail significant challenges: firstly in the assimilation of data, secondly in the computational burden, and thirdly in the coupling of models in an online, dynamic manner.
- Each of these challenges requires different innovative study and a range of methods.
- The student will be able to pick one or more of these three challenges to explore and develop into a full PhD, should they choose this pathway.
- For example, the coupling between hydrological and hydraulic models requires the exploration of different coupling approaches, and will require consideration of standards such as the Basic Model Interface (BMI).
ArctExa: Towards Exascale Computing for Monitoring Arctic Ice Melt
Project institution: Lancaster University
Project supervisor(s): Prof Mal McMillan (Lancaster University), Dr Dave McKay (University of Edinburgh), Dr Jenny Maddalena (Lancaster University) and Dr Israel Martinez Hernandez (Lancaster University)
Overview and Background
This project offers the exciting opportunity to be at the forefront of research exploiting the potential of exascale computing for satellite monitoring of Earth’s polar regions at scale.
The Arctic is one of the most rapidly warming regions on Earth, with ongoing melting of the Greenland Ice Sheet and Arctic ice caps making a significant contribution to global sea level rise. As Earth’s climate continues to warm throughout the 21st Century, ice melt in the Arctic is expected to accelerate, leading to large-scale social and economic disruption.
Satellites provide a unique tool for monitoring the impact of climate change upon the Arctic, and are key to tracking the ongoing contribution that ice masses make to sea level rise. With recent increases in data volumes, computing power and the use of data science, comes huge potential to rapidly advance our ability to monitor and predict changes across this vast and inaccessible region. However, currently this potential is not fully realised.
This project will place you at the forefront of this research, working to advance our current capabilities towards exascale computing, through a combination of state-of-the-art satellite datasets, high performance compute, and innovative data science methods. You will be supported by a multidisciplinary supervisory team of statisticians, computer scientists and environmental scientists, with opportunities to contribute to projects run by the European Space Agency. Specifically, this project aims to develop new large-scale estimates of surface meltwater fluxes from all Arctic ice sheets and ice caps into the ocean and, in doing so, better constrain their contribution to sea level rise over the past two decades.
Methodology and Objectives
Project Aim: This project aims to utilise new streams of satellite data, alongside advanced statistical algorithms and compute, to transform our ability to monitor glacier melt at the pan-Arctic scale. More specifically, the successful candidate will develop new estimates of ice cap and ice sheet melt using high-volume, high-resolution datasets from the latest NASA and ESA satellite altimeters. These will be used to determine the first large-scale estimates of meltwater run-off into the Arctic Ocean.
Methods Used: This project will build upon recent proof-of-concept work developing Kalman Smoothing Data Assimilation techniques to create and analyse a unique record of ice melt. The focus of this PhD will be to apply these methods to the latest high-volume satellite altimetry datasets, and to do so at a massive scale. Fully exploiting these big data streams at the pan-Arctic scale will necessitate the use of Graphics Processing Units (GPUs) on High Performance Computing (HPC) clusters. As such, developing the code to work on this high-level computing architecture will be a key element of the project. Within the first year of the PhD, the successful candidate will have the opportunity to explore two teaser projects, one of which will then be taken forward into subsequent years.
Teaser Project 1: High Resolution Measurements of Greenland Ice Melt over the past 15 years
This teaser project will develop novel estimates of Greenland ice melt over the past 15 years, based upon state-of-the-art CryoSat-2 swath altimetry satellite data. Specifically, a Kalman Smoothing approach, which has recently been tested within our group at several small-scale sites, will be further developed and deployed at scale, with the aim of mapping elevation changes across the entire ice sheet at high resolution. Achieving this will require the current prototype code to be refactored and then deployed for the first time on GPU-enabled systems. Depending on the progress made, there will also be the opportunity to integrate other data streams, for example complementary measurements from the Sentinel-3 high-resolution Synthetic Aperture Radar altimeters.
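As a rough illustration of the underlying idea, a scalar random-walk Kalman filter with a Rauch-Tung-Striebel backward smoothing pass can be sketched in a few lines. The real processing chain operates on far richer state and observation models; all names and noise values here are illustrative:

```python
def kalman_smooth(obs, q=0.01, r=0.25, m0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter plus Rauch-Tung-Striebel smoother.
    State = surface elevation; obs = noisy altimetry measurements,
    with None marking epochs with no satellite pass."""
    mf, pf = [], []                     # filtered means / variances
    m, p = m0, p0
    for y in obs:
        m_pred, p_pred = m, p + q       # random-walk prediction step
        if y is None:
            m, p = m_pred, p_pred       # no measurement: carry prediction
        else:
            k = p_pred / (p_pred + r)   # Kalman gain
            m = m_pred + k * (y - m_pred)
            p = (1 - k) * p_pred
        mf.append(m)
        pf.append(p)
    ms = mf[:]                          # backward smoothing pass
    for t in range(len(obs) - 2, -1, -1):
        g = pf[t] / (pf[t] + q)         # smoother gain
        ms[t] = mf[t] + g * (ms[t + 1] - mf[t])
    return ms
```

Because each grid cell's time series can be smoothed independently, the computation is embarrassingly parallel, which is what makes GPU deployment at the ice-sheet scale attractive.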
Teaser Project 2: Towards pan-Arctic Monitoring of Ice Melt
The second teaser project will make use of the same Kalman Smoothing approach introduced above, ensuring close synergy and complementarity between the two first-year teaser projects, so that the student gains maximum benefit from developing their technical skills in this area. Here, the student will deploy the Kalman Smoothing approach for the first time to monitor a small, highly sensitive Arctic ice cap, such as Austfonna in the Svalbard Archipelago. These smaller ice caps represent a more challenging target for satellite-based monitoring, so alongside CryoSat-2 the student will also test the use of complementary ICESat-2 photon-counting altimetry within the Kalman framework. Because of the high data volumes and the longer-term ambition to operate at the pan-Arctic scale, this project will again work to deploy the processing chain on GPUs.
In later years of the PhD, depending upon the student’s interests, there will be the opportunity either to extend this work to integrate output from Regional Climate Model simulations, to build more sophisticated machine learning elements into the processing chains, or to utilise other diverse streams of high-volume data, such as ultra-high resolution Digital Elevation Models or historical satellite missions.
References and Further Reading
- Antarctica’s ice is melting 5 times faster than in the 90s
- Climate change: Satellite fix safeguards Antarctic data
- Greenland lost a staggering 1 trillion tons of ice in just four years
- CPOM
- CEEDS
Decoding biological colour: leveraging AI to analyse big data on animal images in a changing world
Project institution: Lancaster University
Project supervisor(s): Dr Sally Keith (Lancaster University), Prof Christopher Nemeth (Lancaster University), Dr David Roy (Lancaster University) and Dr Christopher Cooney (University of Sheffield)
Overview and Background
Understanding biodiversity is crucial to predict the impacts of global change and inform conservation strategies. One aspect of biodiversity that has been largely overlooked is biological colour, because it is more complex to quantify than other facets of biodiversity such as species richness. We therefore have little understanding of why species are the colours they are, the role of colour in mediating interactions with other species (reproduction, competition, predation) and with the environment (e.g., thermoregulation), and the implications of environmental change for this role (e.g., the effect of a changing background on camouflage ability). One reason research in this area is limited is that quantification of colour and pattern is conceptually challenging and computationally intensive. To advance our understanding of biological colour, new, more efficient techniques must be developed for its quantification, visualisation and interpretation.
Methodology and Objectives
Biodiversity is entering the realms of ‘big data’ with the rapid growth in Citizen Science, emerging sensors for more automated monitoring, and progress towards digital twins. These approaches are generating large image datasets in near-real-time. Leveraging recent advancements in machine learning (ML) and artificial intelligence (AI), we can unlock unprecedented insights into these datasets. One area of interest is enabling detailed analysis of biological colouration and patterning. State-of-the-art techniques such as convolutional neural networks (CNNs) and vision transformers (ViTs) can allow precise quantification of colour distributions across species and ecological communities. Through such approaches, it becomes possible to monitor and predict the effects of environmental change on biological colouration, informing conservation strategies and advancing our understanding of ecological processes.
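Before any learned features enter the picture, the basic task of quantifying an image's colour distribution can be illustrated with a simple hue histogram, a hand-crafted stand-in for the CNN/ViT-derived colour features the project would actually use (all names and values below are illustrative):

```python
import colorsys

def hue_histogram(pixels, bins=12):
    """Summarise an image's colour distribution as a normalised hue
    histogram. pixels: iterable of (r, g, b) tuples with values 0-255."""
    counts = [0] * bins
    for r, g, b in pixels:
        h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        counts[min(int(h * bins), bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]

# Toy "image": two-thirds red pixels, one-third cyan pixels
pixels = [(255, 0, 0)] * 200 + [(0, 255, 255)] * 100
hist = hue_histogram(pixels)
```

Histograms like this, computed per specimen, can then be aggregated across co-occurring species to describe an assemblage-scale colourscape.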
Teaser Project 1: Have community colourscapes shifted over time in response to altered environmental conditions?
Objectives:
- Quantify colour across insect species from citizen science images.
- Reveal the diversity of colour and pattern across UK insect species.
- Follow-on. Create new methods to determine assemblage-scale colourscapes.
- Follow-on. Determine if and how insect assemblage colouration has changed over time and space, and explore the implications of that change for ecosystem function
This project draws on the concept of bioacoustic “soundscapes” to explore assemblage-scale shifts in the context of environmental change. For the teaser project, the student will apply CNNs and ViTs to quantify the colour of multiple insect species through access to two main image datasets: (1) images submitted via the iRecord citizen science platform, which includes >5 million photos spanning >20,000 UK species; and (2) standardised images of nocturnal insects captured by the UKCEH Automated Monitoring of Insects (AMI) sensor network in Central America, Africa, and Asia, which involves 40 devices generating ~2 TB of image data per year. This quantification will provide the foundation for a larger project coupling these colour quantifications with temporal assemblage data derived from decades of species observations across the UK (UKCEH Biological Records Centre; GBIF). This integration would enable analysis of how colourscapes – the frequency and distribution of colour across co-occurring species – have changed nationally over time, and the ecological implications of such changes. To engage conservation stakeholders, the student could later develop interactive geospatial visualisations of colour changes over time. Once calibrated, the project could be expanded to other regions and taxa, particularly where AMI sensors are deployed, and/or with coral reef fishes and birds leveraging supervisory expertise.
Teaser Project 2: Does habitat degradation disrupt the evolutionary match between animal colouration and its background?
Objectives:
- Quantify coral reef background colouration using computational methods applied to 3D photogrammetry data.
- Reveal the relationship between habitat colour diversity and reef degradation.
- Follow-on. Assess potential mismatch between reef fish colouration and habitat colour in degraded versus healthy reefs, and its implication for ecological function.
This project explores how coral reef habitat colour diversity is affected by environmental degradation using 3D models built from photogrammetry. The teaser project will build towards a broader exploration of how environmental change impacts the ecological and evolutionary functions of biological colouration. While current research on biological colouration focuses mainly on organisms, understanding their ecological context requires quantifying habitat background – the visual stage on which ecology plays out. Animal perception of colour depends on background contrast, hue, and brightness, which affect clarity and can induce colour shifts. Environmental change altering habitat background colour may heighten risks such as predation, disrupted sexual selection, or intensified competition. Coral reefs are an ideal model system because degradation shifts habitats from diverse, colourful coral to relatively uniform algae. Coral reefs are increasingly mapped in 3D via photogrammetry, creating a growing repository of models for analysis (e.g., Operation Wallacea, MARS project) and facilitating new model generation. 3D models overcome the limitations of standard photographs, which often fail to capture the structural complexity and dynamic colouration of reef habitats. Reef fishes offer a rich testbed for investigating how shifts in habitat colour affect evolved colouration. Coral reefs also provide essential ecosystem services for billions of people, yet are among the most threatened ecosystems globally, with stressors like bleaching altering habitat colour and reducing background diversity.
References and Further Reading
- Hemingson et al (2024) Analysing biological colour patterns from digital images: An introduction to the current toolbox. Ecology & Evolution (click here)
- Koneru & Caro (2022) Animal coloration in the Anthropocene. Frontiers in Ecology & Evolution (click here)
- Cuthill et al (2017) The biology of color. Science (click here)
- Cooney et al (2022) Latitudinal gradients in avian colourfulness. Nature Ecology & Evolution (click here)
- Caves et al (2024) Backgrounds and the evolution of visual signals. Trends in Ecology & Evolution (click here)
Developing GPU-accelerated digital twins of ecological systems for population monitoring and scenario analyses
Project institution: University of Glasgow
Project supervisor(s): Prof Colin Torney (University of Glasgow), Prof Juan Morales (University of Glasgow), Prof Rachel McCrea (Lancaster University), Dr Tiffany Vlaar (University of Glasgow) and Prof Dirk Husmeier (University of Glasgow)
Overview and Background
This PhD project focuses on advancing ecological research by using high-resolution datasets and GPU computing to develop digital twins of ecological systems. The study will concentrate on a population of free-roaming sheep in Patagonia, Argentina, examining the relationship between individual decision-making and population dynamics. Using data from state-of-the-art GPS collars, the research will investigate the impact of an individual’s condition on activity budgets and space use, and the dual influence of parasites on behaviour and energy balance. The digital twins will enhance the accuracy of population-level predictions and offer a versatile and transferable framework for ecosystem monitoring, providing critical insights for environmental policy, conservation strategies, and sustainable food systems.
Methodology and Objectives
A digital twin is a virtual replica of a physical system that can be used to investigate the system’s dynamics, predict potential failures, and optimise decision-making processes. What distinguishes digital twins from other simulation models is their ability to continuously update with real-time data. This feature allows them to represent the current state of the system accurately, ensuring that the model is consistently learning from empirical data.
The focus of this PhD is to develop a digital twin of an ecological system. This digital twin will serve as a platform for exploring methods of capturing the emergent distribution of vital rates. It will also facilitate updates based on new data and will therefore function as both a learning tool and a forecasting tool, predicting future states of the system under different scenarios. One of the central objectives of this project is to determine the extent to which individual-level dynamics can be simplified without compromising prediction accuracy.
The digital twin developed will be an individual-based model, with rules for individual behaviour and space use informed by empirical data and theoretical principles. The landscape in which the animals move will be composed of GIS layers derived from remote sensing data and vegetation maps, while movement and activity data will be supplied by GPS collar devices. The project will employ statistical inference techniques and GPU-based simulations of multiple individuals and populations in parallel, using a process of importance sampling and resampling based on data. This process will ensure that only model parameters for which the simulations are consistent with observations are retained, and the resulting posterior distributions of parameter values are iteratively refined. In this way, the digital twin will efficiently represent animals transitioning between different behaviours and landscape usage, while monitoring their energy gains and losses.
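The importance sampling and resampling step described above can be sketched as a single round of simulation-based inference: parameters are drawn from a prior, the simulator is run, and only parameters whose output is consistent with the observation are retained. The toy simulator, tolerance, and names below are illustrative assumptions, not the project's model:

```python
import random

def abc_resample(simulate, observed, prior, n=1000, tol=0.5, seed=0):
    """One round of importance sampling/resampling: draw candidate
    parameters from the prior, run the simulator, and keep only those
    whose simulated output lies within `tol` of the observation."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        theta = prior(rng)                 # sample a candidate parameter
        if abs(simulate(theta, rng) - observed) < tol:
            kept.append(theta)             # retain consistent parameters
    return kept

# Toy system: daily energy gain is grazing_rate * 10 plus noise; observed gain = 5
simulate = lambda theta, rng: theta * 10 + rng.gauss(0, 0.5)
prior = lambda rng: rng.uniform(0, 1)      # flat prior on grazing rate
posterior = abc_resample(simulate, observed=5.0, prior=prior)
```

Because each candidate parameter set can be simulated independently, this loop maps naturally onto GPU-parallel simulation of many individuals and populations at once.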
Teaser Project 1: Predicting individual behaviour, space use, and condition
The first project will focus on the individual-level. Using telemetry data and periodic measurements of individual body condition, models will be developed to predict individual behaviour, space use, and changes to condition. The project will explore how individuals transition between behaviours and how this influences, and is influenced by, their internal state and condition. The impact of parasites on behaviour and energy balance will also be investigated. The project will develop recharge models which capture the internal state of an animal as several state variables that are either depleted or replenished depending on the activities of the animal. Individual-based models will be used to simulate complex decision-making processes as a function of internal states and environmental features. Model predictions will be compared to empirical data and the mismatch between the prediction and the observations will be used to refine the model and update estimates of parameter uncertainty.
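A recharge model of this kind can be sketched, under strongly simplified assumptions, as a single internal-state variable updated by activity-dependent gains and costs. All rates below are illustrative placeholders, not fitted values:

```python
def recharge_step(state, activity, gain=0.8, costs=None):
    """One time step of a minimal recharge model: an internal state
    variable (e.g. energy reserves) is replenished while foraging and
    depleted by the energetic cost of each activity."""
    costs = costs or {"forage": 0.2, "move": 0.5, "rest": 0.1}
    delta = (gain if activity == "forage" else 0.0) - costs[activity]
    return max(0.0, state + delta)         # reserves cannot go negative

# Energy trajectory over a short activity sequence
state = 1.0
for act in ["forage", "move", "rest", "forage"]:
    state = recharge_step(state, act)
```

In the full model, several such state variables would be tracked per individual, with parasite load modifying both the gain and the cost terms.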
Teaser Project 2: Connecting individuals to population dynamics
The second project will explore different potential modelling approaches to capture the dynamics of wildlife populations. The digital twin will serve as a benchmark model that encapsulates multiple facets of the complex dynamics of the system. More abstract population models will be developed that ignore or summarize individual variation, spatial heterogeneity, and the feedbacks between individual behaviour and condition. These models will be compared to the digital twin to assess the capacity of coarser representations of the system to accurately predict population declines depending on environmental conditions. The comparison will consider the trade-offs between model accuracy and the time and energy required to run the models.
References and Further Reading
- Blair, G. S. (2021). Digital twins of the natural environment. Patterns, 2(10)
- Hooten, M. B., Johnson, D. S., McClintock, B. T., & Morales, J. M. (2017). Animal movement: statistical models for telemetry data. CRC press
- Torney, C. J., Morales, J. M., & Husmeier, D. (2021). A hierarchical machine learning framework for the analysis of large scale animal movement data. Movement ecology, 9, 1-11
- Kavwele, C. M., Hopcraft, J. G. C., Morales, J. M., Nyafi, G., Kimuya, N., & Torney, C. J. (2024). Real‐time classification of Serengeti wildebeest behaviour with edge machine learning and a long‐range IoT network. Canadian Journal of Zoology
Downscaling and Prediction of Rainfall Extremes from Climate Model Outputs (RainX)
Project institution: University of Glasgow
Project supervisor(s): Dr Sebastian Gerhard Mutz (University of Glasgow) and Dr Daniela Castro-Camilo (University of Glasgow)
Overview and Background
In the last decade, Scotland’s annual rainfall has increased by 9%, and winter rainfall by 19%, with more water coming from extreme events, posing risks to the environment, infrastructure, health, and industry [Sniffer, 2021]. Urgent issues such as flooding, mass wasting, and water quality are closely tied to rainfall extremes [Sniffer, 2021]. Reliable predictions of extremes are, therefore, critical for risk management. Prediction of extremes, which is one of the main focuses of extreme value theory [Friederichs, 2010], is still considered one of the grand challenges by the World Climate Research Programme [Alexander et al., 2016]. This project will address this challenge by developing novel, computationally efficient statistical models that are able to predict rainfall extremes from the output of GPU-optimised climate models.
Methodology and Objectives
General Circulation Models (GCMs) are the primary tools for predicting future climate change [IPCC, 2023; and references therein]. While these GCM simulations are suitable for studies investigating climate dynamics and changes on coarse spatiotemporal scales, their skill in predicting local-scale climate and extremes remains very limited [IPCC, 2023]. Statistical Downscaling (SD) addresses this problem by linking coarse climate information to local-scale observational data [e.g., Hewitson et al., 2014; Mutz et al., 2021] using statistical models, which enables us to “translate” GCM output into predictions that are more relevant for regional impact studies and adaptation measures. This project will leverage recent advances in SD for extremes [Cuba et al., 2024+] to develop a set of algorithms for predicting rainfall extremes in Scotland from the output of the latest GPU-optimised GCM, ICON [Giorgetta et al. 2018]. These will be integrated into the user-focused, open-source tool pyESD [Boateng and Mutz, 2023]. Both teaser projects will rely on two datasets: 1) meteorological observations that capture rainfall extremes in Scotland (i.e., the “predictand” dataset), and 2) a dataset used for SD model fitting (i.e., the “predictor” dataset).
Teaser Project 1: “Perfect Prognosis” Approach
In the perfect prognosis approach, SD models are constructed from observation-based datasets for both the predictand and predictors. These models, therefore, capture real-world conditions and relationships, aiding model validation and improving our physical understanding of local-scale predictand variability. Another strength of this approach is the ability to couple the SD models to any GCM or dataset (e.g., Ramon et al., 2021), making them highly transferable. The predictor dataset for Teaser Project 1 will be ERA5 reanalysis data [Hersbach et al., 2020]. The SD models will then be coupled to 21st century simulations conducted with the GCM ICON [Giorgetta et al. 2018] to predict future changes in rainfall extremes in Scotland.
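The perfect prognosis workflow (fit on observation-based predictor and predictand data, then couple the fitted model to GCM output) can be sketched with ordinary linear regression standing in for the extreme-value models the project will actually use; all data values and names below are illustrative:

```python
def fit_downscaling(predictor_obs, predictand_obs):
    """Fit a simple linear 'perfect prognosis' downscaling model:
    both predictor (e.g. a coarse reanalysis field) and predictand
    (e.g. station rainfall) come from observation-based datasets."""
    n = len(predictor_obs)
    mx = sum(predictor_obs) / n
    my = sum(predictand_obs) / n
    sxx = sum((x - mx) ** 2 for x in predictor_obs)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(predictor_obs, predictand_obs))
    slope = sxy / sxx
    return lambda x: my + slope * (x - mx)   # the fitted SD model

# Train on observation-based pairs, then couple the model to GCM output
model = fit_downscaling([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.1, 7.9])
gcm_future = [5.0, 6.0]                      # hypothetical ICON predictor values
local_pred = [model(x) for x in gcm_future]
```

Because the statistical link is estimated entirely from observations, the same fitted model can be coupled to any GCM or reanalysis product, which is the transferability advantage noted above.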
Teaser Project 2: “Model Output Statistics” Approach
Model Output Statistics also uses an observation-based predictand dataset, but the predictor dataset is simulated with climate models (e.g., GCMs). The relationships captured in the resulting SD models do not reflect physical processes as closely as in the perfect prognosis approach, and the SD models are fine-tuned to a specific climate model. However, when used in tandem with this climate model, the approach often produces more accurate results and excels at climate model bias correction (e.g., Sachindra et al., 2014). The predictor dataset for Teaser Project 2 will be ICON simulations for the present-day climate. The SD models will then be coupled to 21st century ICON simulations to predict future changes in rainfall extremes in Scotland.
References and Further Reading
- Alexander, L.V., Zhang, X., Hegerl, G. & Seneviratne, S.I. (2016). Implementation Plan for WCRP Grand Challenge on Understanding and Predicting Weather and Climate Extremes – the “Extremes Grand Challenge”. Version, June 2016 (click here)
- Boateng, D. & Mutz, S. G. (2023). pyESDv1.0.1: an open-source Python framework for empirical-statistical downscaling of climate information. Geosci. Model Dev., 16, 6479–6514 (click here)
- Coles, S. G. (2001). An Introduction to the Statistical Modeling of Extreme Values. London: Springer
- Cuba, M.D., Wilkie, C., Scott, M. & Castro-Camilo, D. (2024+). Data fusion for threshold exceedances using a censored Bayesian hierarchical model. To appear
- Friederichs, P. (2010). Statistical downscaling of extreme precipitation events using extreme value theory. Extremes, 13, 109-132
- Giorgetta, M. A., Brokopf, R., Crueger, T., Esch, M., Fiedler, S., Helmert, J., et al. (2018). ICON-A, the atmosphere component of the ICON Earth system model: I. Model description. Journal of Advances in Modeling Earth Systems, 10, 1613–1637 (click here)
- Hersbach, H., Bell, B., Berrisford, P., Hirahara, S., Horányi, A., Muñoz-Sabater, J., et al. (2020). The ERA5 global reanalysis. Q. J. Roy. Meteor. Soc., 146, 1999–2049 (click here)
- Hewitson, B. C., Daron, J., Crane, R. G., Zermoglio, M. F. & Jack, C. (2014). Interrogating empirical-statistical downscaling. Clim. Change, 122, 539–554 (click here)
- IPCC (2023). Geneva, Switzerland, 35-115 (click here)
- Mutz, S. G., Scherrer, S., Muceniece, I. & Ehlers, T. A. (2021). Twenty-first century regional temperature response in Chile based on empirical-statistical downscaling. Clim. Dynam., 56, 2881–2894 (click here)
- Ramon, J., Lledó, L., Bretonnière, P.-A., Samsó, M. & Doblas-Reyes, F. J. (2021). A perfect prognosis downscaling methodology for seasonal prediction of local-scale wind speeds. Environ. Res. Lett., 16, 054010 (click here)
- Sachindra, D. A., Huang, F., Barton, A. & Perera, B. J. C. (2014). Statistical downscaling of general circulation model outputs to precipitation – part 2: bias-correction and future projections. Int. J. Climatol., 34, 3282–3303 (click here)
- Sniffer (2021): ‘Third UK Climate Change Risk Assessment Technical Report: Summary for Scotland’ (click here)
Foundational models for Ecology
Project institution: Lancaster University
Project supervisor(s): Dr Kit Macleod (Lancaster University), Dr Alex Bush (Lancaster University) and Dr Clare Rowland (Lancaster University)
Overview and Background
Satellite missions generate data on the scale of exabytes (millions of terabytes), so the need for efficient data analysis has become paramount. Geospatial Foundation Models (GFMs) are currently revolutionizing our approach to machine learning by using vast unlabelled archives of satellite images to self-supervise model training. The completed GFMs then require minimal additional labelled data to be trained for new applications like flood and wildfire mapping. In this project you will explore opportunities for GFMs to improve habitat mapping in support of conservation and ecosystem restoration. This project will combine detailed long-term monitoring datasets collected by UKCEH to understand how such state-of-the-art GFMs can benefit ecology and support land management under a changing climate.
Methodology and Objectives
The latest GFM (December 2024), developed by NASA and IBM, is called Prithvi-EO-2.0 and combines multi-spectral data from millions of Landsat and Sentinel images collected over a decade. The Prithvi model projects draw on state-of-the-art remote-sensing and software-engineering expertise. The tools to fine-tune GFMs, including TerraTorch, are open-source (available on Hugging Face and GitHub), so there is enormous scope for this research. For example, one recent advance integrates data from multiple satellite sensors (e.g. ICESat-2 and GEDI, the Global Ecosystem Dynamics Investigation) to fine-tune the GFM to predict above-ground biomass.
The basic structure of both teaser projects is to take highly detailed UK-specific datasets, use them to fine-tune a GFM, and assess whether the outputs exceed standard modelling approaches and datasets.
Teaser Project 1: Can fine-tuning GFMs be used for improved habitat mapping of habitat type and condition?
Accurate data on habitat type and habitat condition are crucial for making informed decisions about the UK’s land. This project will explore how GFMs could be used to improve the quality of our habitat data, enabling more informed decisions; habitat condition is a particular focus because it is difficult to monitor.
Objectives:
- Fine-tuning the GFM – Can species records from structured surveys, like the National Plant Monitoring Scheme (NPMS), or the National Forest Inventory (NFI) be used to fine-tune GFMs to improve habitat assessment? Can unstructured citizen-science data also be incorporated?
- Assessing the output – How do GFM habitat classifications compare to traditional machine learning approaches such as random forest? Do they provide more insight, or different insights, to traditional approaches?
- Developing the role of GFMs in ecology – Review Objectives 1.1 and 1.2 to develop an approach for using GFMs in ecology, including the types of field survey data that are most beneficial for GFM fine-tuning, and the ecological use cases where GFMs are most (and least) beneficial.
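As a concrete illustration of the comparison in the second objective, the sketch below uses entirely synthetic data and assumes an sklearn stack (real inputs would be satellite spectral bands and embeddings from a pretrained GFM encoder): it contrasts a random-forest baseline on per-pixel features with a lightweight linear head trained on frozen foundation-model embeddings, which is the usual low-cost form of fine-tuning.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic stand-ins: 'bands' mimics per-pixel spectral features and
# 'embeddings' mimics richer representations from a pretrained GFM encoder.
n, n_classes = 900, 4
labels = rng.integers(0, n_classes, size=n)
bands = rng.normal(size=(n, 10)) + 0.5 * labels[:, None]
centres = rng.normal(size=(n_classes, 128))
embeddings = centres[labels] + rng.normal(size=(n, 128))

# Baseline: random forest on spectral features (the traditional approach).
rf_acc = cross_val_score(
    RandomForestClassifier(random_state=0), bands, labels, cv=3).mean()

# GFM-style transfer: a light linear head on frozen embeddings.
head_acc = cross_val_score(
    LogisticRegression(max_iter=1000), embeddings, labels, cv=3).mean()

print(f"random forest on bands: {rf_acc:.2f}; head on embeddings: {head_acc:.2f}")
```

On real habitat data the interesting question is not only which score is higher, but whether the two approaches misclassify different habitats, which is what Objective 1.2 asks.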
Teaser Project 2: Can fine-tuned GFMs be used for detecting ecological change of habitats?
Thousands of satellite images are freely available for every site of interest in the UK. But converting this data into actionable information is complex. This project will explore the use of fine-tuning GFMs to extract useful information from these massive time-series of satellite data to better understand ecological change.
Objectives:
- Fine-tuning the GFM – Monitoring and understanding change over time is important, but data sets are limited. The first step will be to review available data sets, to see which are most suitable for detecting ecological change. The most suitable data sets will then be used to train the GFM to detect ecological change for key case study sites where known changes have occurred e.g. rewilding and other forms of habitat restoration.
- Case studies – Ecological changes are often slow, and the impact of management actions may take years to be observed. Can GFMs track changes within, or between closely related, habitats that resulted from degradation or restoration? Habitat quality is as important as quantity and is a key indicator for many national and local policies. Which indicators of ecosystem condition can GFMs predict accurately?
- Understanding the role of GFMs in ecology – Review Objectives 2.1 and 2.2 to understand how GFMs can be used to help ecologists, including identifying outlier sites for further investigation, and selecting where new labelled training data is expected to add most value to train the GFMs.
References and Further Reading
- Hunt, M. L., Blackburn, G. A., Siriwardena, G. M., Carrasco, L., & Rowland, C. S. (2023). Using satellite data to assess spatial drivers of bird diversity. Remote Sensing in Ecology and Conservation, 9(4), 483-500
- Jakubik, J., Roy, S., Phillips, C.E et al. (2023). Foundation Models for Generalist Geospatial Artificial Intelligence. arXiv, 2310.18660
- Marston, C. G., O’Neil, A. W., Morton, R. D., Wood, C. M., & Rowland, C. S. (2023). LCM2021–the UK land cover map 2021. Earth System Science Data, 15(10), 4631-4649
- Szwarcman, D., Roy, S., Fraccaro, P. et al. (2024). Prithvi-EO-2.0: A Versatile Multi-Temporal Foundation Model for Earth Observation Applications. arXiv, 2412.02732
- Artificial Intelligence for Science, NASA Science
- Expanded AI Model with Global Data Enhances Earth Science Applications, NASA Science
- IBM/terratorch: a Python toolkit for fine-tuning Geospatial Foundation Models (GFMs)
GPU-accelerated multiscale modelling for glacier sliding
Project institution: University of Glasgow
Project supervisor(s): Dr Andrei Shvarts (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow) and Prof Todd Ehlers (University of Glasgow)
Overview and Background
Global climate and environmental change are increasingly resulting in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, precipitation events or drought, and temperature extremes, amongst other things. A team of University of Glasgow researchers are developing an Earth systems digital twin for exascale computing that works on GPU computers and uses weather forecasts to predict the cascading effect of climate change events on environmental systems. Our goal is to provide predictions, at the national or large scale, of the impacts of environmental extremes on natural and urban settings. This project is one stand-alone component of this larger-scale project.
In this project, you will focus on the glacier-bedrock interface. Mountain glaciers are changing rapidly worldwide in response to climate change. Glacier changes affect global trends in freshwater availability, contribute to recent sea level changes, and affect regional water resources over the twenty-first century. However, uncertainties remain in projecting such impacts in future climate change scenarios. A major source of these uncertainties is the lack of understanding of glacier sliding – the relative motion between glacial ice and underlying rocks (Zoet, L. K., & Iverson, N. R., 2020). In this project, you will develop a GPU-accelerated multiscale modelling framework for glacier sliding to tackle this problem.
Your job while working on this project will involve software development for simulating the relevant multiphysical processes, applying the model to historic data for validation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
The subglacial system consists of ice, water, and rocks. Among these components, different processes and feedbacks operate at different spatial and temporal scales, making the system a challenging computational problem to simulate. To tackle this problem, this project will adopt a multiscale modelling approach combining microscale modelling of the ice-bedrock interface with macroscale simulation of glacier dynamics and subglacial hydrology. A particular focus will be developing models for GPU architectures to enable high-resolution and scalable simulation.
Methods used in this project will involve numerical simulation using the open-source parallel finite element library MoFEM developed and supported at the University of Glasgow (Kaczmarczyk, Ł., et al, 2020).
Teaser Project 1:
This teaser project, conducted during the first year, will focus on developing a microscale model of the contact interface between glacier ice and bedrock. When the ice is compressed by its own weight against the bedrock, the roughness of both the ice and bedrock surfaces creates a complex contact problem where only isolated regions of the interface are in actual contact. Simultaneously, the remaining areas form free volumes that can be occupied by flowing or stagnant (trapped) water mixed with sediments. The sub-project will build upon a previously developed finite-element framework (Shvarts, A.G. et al., 2021) by enabling its application in a distributed-memory parallel computing environment and providing further GPU acceleration using the functionality available in the MoFEM library. Additionally, the framework will be enhanced to incorporate friction between the ice and bedrock. Using available data to calibrate the model, the extended framework will predict the shear strength of the interface as a function of various parameters, including the weight of the ice, surface roughness, sediment density, and water pressure. These predictions will be compared with existing phenomenological models of glacial sliding to refine and improve the latter.
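For context on the phenomenological sliding laws that the microscale predictions will be compared against, here is a schematic regularised-Coulomb slip law, broadly of the kind discussed by Zoet & Iverson (2020). All parameter values below are illustrative, not calibrated, and the exact functional form is a sketch rather than the project's model.

```python
import numpy as np

def basal_shear_stress(u_b, N, C=0.5, u_t=100.0, p=5.0):
    """Schematic regularised-Coulomb slip law: basal drag rises with
    sliding speed u_b (m/yr) but is bounded by the Coulomb limit C*N set
    by effective pressure N (Pa). C, u_t and p are illustrative values."""
    return C * N * (u_b / (u_b + u_t)) ** (1.0 / p)

# Drag increases with speed but saturates below the Coulomb bound C*N;
# the microscale contact simulations aim to predict such curves from
# roughness, sediment density and water pressure rather than assume them.
speeds = np.array([1.0, 10.0, 100.0, 1000.0])
tau = basal_shear_stress(speeds, N=1e5)
print(tau)
```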
Teaser Project 2:
This sub-project, also conducted during the first year, addresses the macroscale problem and focuses on the behaviour of the glacier as a whole. It will leverage the finite-element model implemented in MoFEM, constructed using available topological and geological data. The model will incorporate the results from the microscale simulations of the first sub-project, which map ice properties to interfacial shear strength, to accurately inform the macroscopic interface behaviour. Utilizing parallel computing with GPU acceleration, this sub-project will simulate large-scale glacier dynamics under varying environmental conditions, including changes in ice thickness, surface temperature, and basal water pressure. These simulations will provide critical insights into the glacier’s flow patterns, deformation, and sliding behaviour, enabling predictions of its response to climate change scenarios. The outcomes will also help validate and refine existing phenomenological models, improving their applicability to real-world glacier systems.
References and Further Reading
- Zoet, L. K., & Iverson, N. R. (2020). A slip law for glaciers on deformable beds. Science, 368(6486), 76–78
- Kaczmarczyk, Ł., et al. (2020). MoFEM: An open source, parallel finite element library. The Journal of Open Source Software, 5(45)
- Shvarts, A.G., Vignollet, J. and Yastrebov, V.A. (2021). Computational framework for monolithic coupling for thin fluid flow in contact interfaces. Computer Methods in Applied Mechanics and Engineering, 379, p.113738
How will climate change affect stratospheric ozone recovery in the Arctic?
Project institution: Lancaster University
Project supervisor(s): Dr James Keeble (Lancaster University), Prof Michèle Weiland (University of Edinburgh), Prof Ryan Hossaini (Lancaster University) and Dr Luke Abrahams (University of Cambridge)
Overview and Background
The stratospheric ozone layer is expected to recover over the course of the 21st century due to the controls the Montreal Protocol places on ozone depleting substances. As a result, the Montreal Protocol is considered by many to be the most successful environmental treaty of all time, and for some serves as a blueprint for how to tackle the climate crisis. However, the Arctic continues to see years with large ozone depletion, and a recent study has suggested that under future scenarios that assume large greenhouse gas emissions polar ozone depletion in the Arctic might get worse. If polar ozone depletion is worsening, this has significant implications for not just the stratosphere, regional climate, and human health, but also on how we interpret the success of the Montreal Protocol and its role as an example for other environmental policy efforts. This project will use recent advances in exascale computing and GPU hardware to model the Arctic stratosphere at unprecedented resolution in a coupled Earth system framework to address these scientific questions.
Methodology and Objectives
This project’s key scientific aim is to examine variability in Arctic stratospheric ozone and explore whether Arctic stratospheric ozone is recovering as expected. This will be achieved through examination of satellite observations and model simulations of the recent past, and high resolution, coupled Earth system modelling to explore atmospheric processes and year-to-year variability in this important region of the atmosphere. This project will use advances in exascale computing and GPU technology to run coupled Earth system models at much higher resolution and for much longer (many centuries) than has been done in the past, and to use that output, alongside other large datasets, to build new models of stratospheric ozone through emulation and machine learning processes.
Teaser Project 1: The drivers of year-to-year variability in Arctic stratospheric ozone over the recent past
This research project focuses on our understanding of the physical processes driving year-to-year Arctic ozone variability, and the impacts these have on surface climate and extreme weather. Key objectives are:
Objective 1: Examine historical Arctic ozone values using observations and model datasets from activities such as CMIP6 and CCMI-2022 to explore extreme years. Particular focus will be on the large Arctic ozone depletion events observed in the winters of 2010/11 and 2019/20, and the record high ozone levels observed in March 2024. Key questions include: are these events linked to atmospheric variability, or evidence of an emerging trend related to climate change? To what extent do models and observations agree?
This teaser project will be developed into a full PhD by using very high-resolution model simulations performed with the UKESM1 model, and exploring the following objectives:
Objective 2: Perform long, high resolution UKESM1 simulations to further explore atmospheric processes. To what extent does higher resolution allow us to better model processes such as orographic gravity waves, dynamical asymmetries in the polar vortex, polar stratospheric cloud formation, and chemistry-climate interactions?
Objective 3: Explore the impact of extreme high and low Arctic polar ozone events on regional weather and climate in high resolution model runs, with a focus on the UK and Europe, and how this might contribute to regional climate change signals.
Teaser Project 2: Using large datasets to develop faster, computationally inexpensive projections of future ozone change in the Arctic
This research project uses the huge amount of data we have from observations and recent model intercomparison projects to develop simple models of Arctic ozone that can provide reliable projections of future ozone recovery without the need to run complex, expensive climate models. Key objectives are:
Objective 1: Analyse past and future changes to Arctic ozone in CCMI-2022 and CMIP6 model simulations to get a sense of how Arctic ozone has changed in the past and is expected to change in the future. Identify the extent to which models and observations agree.
Objective 2: Using machine learning approaches, develop a computationally inexpensive emulator trained on the multi model CMIP6 and CCMI-2022 dataset that can make projections of Arctic stratospheric column ozone under different climate states.
This teaser project will be developed into a full PhD by developing the capabilities of the emulator model beyond 1-dimensional stratospheric ozone column projections:
Objective 3: Develop the emulator model of Objective 2 so that it can make projections of three dimensionally resolved (latitude-longitude-altitude) ozone in the Arctic stratosphere.
Objective 4: Perform high resolution UKESM1 model simulations looking at polar ozone under a range of future climate states. Key areas to explore will be (1) changes to stratospheric dynamics and temperature in response to greenhouse gas emissions, (2) the role of stratospheric water vapour increases in driving changes to polar ozone depletion, and (3) the role of large wildfires and associated soot particles as sites of heterogeneous chemistry. Use these new simulations to further refine the emulator developed in Objective 3.
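The emulation step in Objective 2 of this teaser project can be sketched as follows. The data here are entirely synthetic (not CMIP6/CCMI values), and random-forest regression is just one stand-in for the machine learning approaches mentioned; the point is that once trained on model output, the emulator projects ozone for new climate states at negligible cost.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic training set standing in for multi-model output: predictors
# could be CO2 concentration, equivalent chlorine loading and a
# polar-vortex index; the target is Arctic total column ozone (DU).
n = 500
co2 = rng.uniform(350, 900, n)
cly = rng.uniform(1.5, 3.5, n)
vortex = rng.normal(0, 1, n)
ozone = 450 - 40 * cly - 0.02 * (co2 - 400) + 15 * vortex + rng.normal(0, 5, n)

X = np.column_stack([co2, cly, vortex])
emulator = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, ozone)

# Once trained, a projection for a new climate state is near-instant,
# avoiding a full Earth-system-model run.
pred = emulator.predict([[420.0, 3.2, -1.0]])
print(pred)
```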
References and Further Reading
- von der Gathen et al., Climate change favours large seasonal loss of Arctic ozone. Nature Communications, 12(1), 3886, 2021
- Polvani et al., No evidence of worsening Arctic springtime ozone losses over the 21st century. Nature Communications, 14, 1608, 2023
- Newman et al., Record High March 2024 Arctic Total Column Ozone. Geophysical Research Letters, 51, e2024GL110924, 2024
- Chapter 4 of the WMO/UNEP Scientific Assessment of Ozone Depletion: 2022
Measuring Biodiversity from volunteer generated ecological data sources
Project institution: University of Glasgow
Project supervisor(s): Prof Ana Basiri (University of Glasgow), Joseph Shingleton (University of Glasgow), Dr Stuart Sharp (Lancaster University), Dr Lydia Bach (University of Glasgow) and Dr Urska Demsar (University of St Andrews)
Overview and Background
Citizen science has given researchers access to unprecedentedly large-scale, affordable, rich, and diverse data. However, many question the inherent biases and quality issues of community/citizen-generated data. Within the field of ecology, communities such as iNaturalist and eBird collate hundreds of millions of georeferenced species observations from users around the world.
In order to address the challenges of quantifying the quality (accuracy, completeness, representation) of these data sources, this PhD uses state-of-the-art modelling and data science techniques, along with other data sources, such as high resolution remotely sensed raster data, to build a foundational understanding of data quality within volunteered ecological data. To achieve this, advanced techniques in computer vision, animal behaviour modelling and geo-AI will be employed, leveraging the considerable computational resources available to the ExaGEO project.
Methodology and Objectives
The two teaser projects will focus on a single aspect of data quality within crowd-sourced and community-generated ecological data: identification of repeat observations. Datasets such as iNaturalist, Movebank, and eBird provide some indication of the spatial distribution of a wide variety of animal species, with some steps taken to ensure reasonable data quality. However, there is currently no protocol for identifying multiple observations of the same individual within a species.
The two teaser projects outlined below take different and complementary approaches to estimating the likelihood that two observations of animals from the same species are in fact of the same individual. The outputs of these models may be used in downstream tasks later in the PhD project to assess the overall quality of the datasets and to provide a quantifiable measure of the reliability of biodiversity estimates derived from them.
Methods Used:
- Advanced computer vision (e.g. vision transformer models)
- Spatially explicit AI models (e.g. Graph Attention Networks)
- Animal behaviour modelling (e.g. ODEs/PDEs)
- Remotely sensed data processing/analysis (e.g. pixel- or object-oriented classification of satellite data)
- Advanced statistical analysis (e.g. Markov chain Monte Carlo (MCMC), Hidden Markov, etc.)
Teaser Project 1
The first teaser project uses advanced computer vision techniques to identify similarities between photographs of animals and distinguish between individuals of the same species. To achieve this, a set of species specific keypoint models will be developed which are able to locate key identifiable features within an image (e.g. facial landmarks, joints, limbs). The relative location of these features, along with the geo-location of the observation and other data, will then be used within an unsupervised clustering model to identify likely observations of the same individual.
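The final clustering step might look like the hedged sketch below. The feature vectors are simulated; in the project they would combine relative keypoint geometry from a species-specific pose model with the observation's geolocation, and DBSCAN is just one candidate unsupervised algorithm.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(5)

# Simulated per-photo descriptors: two individuals with distinct markings,
# each photographed several times.
ind_a = rng.normal(loc=0.0, scale=0.03, size=(6, 8))
ind_b = rng.normal(loc=1.0, scale=0.03, size=(4, 8))
features = np.vstack([ind_a, ind_b])

# Unsupervised clustering: photos landing in the same cluster are flagged
# as likely repeat observations of one individual.
labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(features)
print(labels)
```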
The success of this part of the project will rely on careful consideration of the taxon/a of study. Factors such as animal physiology, data availability and quality, and image processing and analysis techniques will play an important role in deciding this. Ultimately, the researcher will decide their area of study, after careful deliberation with the supervisory team.
Teaser Project 2
The second teaser project involves the development of a spatio-temporal model for animal behaviour. A single observation of an individual animal consists of (at minimum) species information, a geolocation and a timestamp. By combining these with other data sources (e.g., land-cover data, remotely sensed data, estimated animal populations) and animal behaviour expertise, the researcher will construct a spatio-temporal model capable of estimating the likelihood that an observation of the same species at a different geo-location and time is the same individual animal.
The approach taken can use either mechanistic models (e.g. ODEs, PDEs) or statistical machine learning models (e.g. MCMC, Hidden Markov, GeoAI), or indeed may employ a combination of both (e.g. Particle/Kalman filtering, approximate Bayesian computation).
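A minimal mechanistic example of the kind described above, assuming a 2-D Brownian-motion movement model with an illustrative per-species diffusivity (a real model would also condition on land cover, population density, and behaviour):

```python
import numpy as np

def same_individual_loglik(d_km, dt_days, diffusivity=2.0):
    """Log-likelihood that two sightings separated by d_km and dt_days are
    the same animal, under a simple 2-D Brownian-motion movement model (one
    mechanistic option among those listed). The diffusivity (km^2/day) is
    a per-species assumption that would need calibrating."""
    var = 2.0 * diffusivity * max(dt_days, 1e-6)  # per-axis displacement variance
    return -np.log(2.0 * np.pi * var) - d_km ** 2 / (2.0 * var)

# A sighting 1 km away after one day is far more plausibly the same
# individual than one 50 km away in the same interval.
near = same_individual_loglik(1.0, 1.0)
far = same_individual_loglik(50.0, 1.0)
print(near, far)
```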
While these two projects have the same overall aim – namely, estimating the likelihood that two observations are of the same individual – their methodologies are very different. Beyond the first year of the PhD, the models created in these projects may be combined, resulting in a single model which uses positional, temporal, and visual information to identify repeat observations. Later work may use this model to investigate other aspects of data quality within volunteered ecological data in even more detail, and will feed into ongoing related projects on balancing the quantity and quality of crowdsourced data.
References and Further Reading
- Lauer et al. (2022), Multi-animal pose estimation, identification and tracking with DeepLabCut, Nature Methods
- Hou et al. (2020), Identification of animal individuals using deep learning: A case study of giant panda, Biological Conservation
- Vidal et al. (2021), Perspectives on Individual Animal Identification from Biology and Computer Vision, Integrative and Comparative Biology
- Wahltinez, O. and Wahltinez, S. J. (2024) An open-source general purpose machine learning framework for individual animal re-identification using few-shot learning, Methods in Ecology and Evolution
- Laxton, M. R. et al. (2022) Balancing structural complexity with ecological insight in Spatio-temporal species distribution models, Methods in Ecology and Evolution
- Karppinen S. et al. (2022) Identifying territories using presence-only citizen science data: An application to the Finnish wolf population, Ecological Modelling
- Dorazio, R. M. and Karanth, K. U. (2017) A hierarchical model for estimating the spatial distribution and abundance of animals detected by continuous-time recorders, PLOS One
- Supp et al. (2021) Estimating the movements of terrestrial animal populations using broad-scale occurrence data, Movement Ecology
- iNaturalist
- Movebank
- eBird
Pace and style of glacial erosion in the Patagonian Andes
Project institution: University of Glasgow
Project supervisor(s): Dr Jingtao Lai (University of Glasgow), Dr Katie Miles (Lancaster University), Dr Sarah Falkowski (University of Glasgow), Dr Sebastian Mutz (University of Glasgow) and Dr Mirjam Schaller (University of Glasgow)
Overview and Background
Glacial erosion plays a critical role in the feedback mechanisms between different Earth systems. Rates and patterns of glacial erosion are controlled by climate variations, and glacial erosion can, in turn, influence the climate by modulating the carbon cycle through chemical weathering and ecosystem changes. Despite its importance, significant uncertainties remain regarding how climate affects the rates and spatial patterns of glacial erosion. The Patagonian Andes, with its broad latitudinal range and rich observational data, offers a valuable natural laboratory to address these questions. This project aims to integrate glacial landscape evolution models with thermochronology data to explore the pace and style of glacial erosion in the Patagonian Andes over the past 10 million years.
Methodology and Objectives
This project will integrate glacial landscape evolution modelling with thermochronology data to investigate glacial erosion in the Patagonian Andes. The student will use the Fastscape landscape evolution model and the Instructed Glacier Model (IGM), a glacier dynamics model that employs a Physics-Informed Neural Network (PINN) approach. Dr Lai, the project supervisor, has successfully integrated IGM with Fastscape, and the student will use this modelling framework to simulate glacial landscape evolution in the Patagonian Andes.

Low-temperature thermochronology provides valuable insights into erosion history by recording the time a rock sample takes to travel from a given depth to the Earth’s surface. In this project, the student will integrate model results from landscape evolution simulations with thermochronology data. Using the simulated evolution of glacial topography as input, the student will use the Pecube model to generate synthetic thermochronological datasets. These results will be compared with existing thermochronology data from the Patagonian Andes, offering new perspectives on the region’s glacial erosion history.

The overall technical objective of this project is to develop a robust and scalable GPU-based modelling framework for landscape evolution in glacial environments. A key aspect of the project will be optimizing the existing code for efficient multi-GPU simulations, enabling large-scale landscape evolution simulations. The project will also involve incorporating climate models and other Earth surface process models into this framework, including orographic precipitation, landslides, and sediment transport.
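To make the thermochronology link concrete, here is a back-of-envelope example with generic values (not Patagonian data): under steady erosion, a rock "starts the clock" when it cools through its closure temperature, so its apparent age is roughly the time taken to travel from the closure depth to the surface.

```python
# All values are illustrative assumptions for the worked example.
closure_temp = 70.0   # deg C, roughly apatite (U-Th)/He
surface_temp = 5.0    # deg C
geotherm = 25.0       # deg C per km, assumed linear
erosion_rate = 0.5    # km per Myr, steady

closure_depth = (closure_temp - surface_temp) / geotherm  # km
apparent_age = closure_depth / erosion_rate               # Myr
print(closure_depth, apparent_age)
```

Faster erosion therefore yields younger apparent ages, which is why the transient erosion pulse after glaciation onset leaves a recognisable signature in thermochronometric data; Pecube replaces this linear-geotherm shortcut with a full thermal-kinematic calculation.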
Teaser Project 1: Investigate valley-scale temporal evolution of glacial erosion rates in the Patagonian Andes
The hypothesis of a global increase in erosion rates due to the expansion of glaciation since the Late Cenozoic (~25 million years ago) remains a topic of intense debate and controversy. While modern glaciers are indeed more erosive than rivers, the response time of glacial erosion — specifically, how long elevated erosion rates persisted following the onset of glaciation — remains uncertain. In the Patagonian Andes, low-temperature thermochronology studies suggest that the onset of glaciation triggered a transient pulse of rapid erosion, followed by a gradual decline in erosion rates toward preglacial levels over response timescales spanning millions of years.
The objective of this teaser project is to understand the transient evolution of a glacial valley after the onset of glaciation and quantify the response time of glacial erosion. The student will connect glacial landscape evolution models with other Earth surface process models, including models for landslides and sediment transport. The student will focus on understanding the potential feedback mechanisms during glacial landscape evolution and exploring response times of glacial erosion in various climatic and tectonic conditions.
Teaser Project 2: Investigate regional-scale spatial patterns of glacial erosion in the Patagonian Andes
The Patagonian Andes span a broad latitudinal range, offering a unique natural laboratory to study how glacial erosion responds to varying climatic conditions. Previous research suggests that glacial erosion rates are influenced by factors such as temperature, precipitation, and the basal thermal conditions of glaciers. However, a comprehensive quantitative assessment of the climatic impact on glacial erosion is still missing. This teaser project aims to integrate glacial landscape evolution models with existing climate reconstructions to simulate the regional history of glacial erosion in the Patagonian Andes. The focus will be on evaluating how latitudinal variations in temperature and precipitation shape the spatial patterns of basal thermal regimes and glacial erosion. Additionally, the model will be coupled with an orographic precipitation model to explore feedback mechanisms between topographic evolution and climate. The simulated spatial patterns of glacial erosion will be compared with those inferred from thermochronology data, providing new insights into the climatic controls on glacial erosion.
References and Further Reading
- Herman, F., Seward, D., Valla, P. G., Carter, A., Kohn, B., Willett, S. D., & Ehlers, T. A. (2013). Worldwide acceleration of mountain erosion under a cooling climate. Nature, 504(7480), 423–426
- Herman, F., De Doncker, F., Delaney, I., Prasicek, G., & Koppes, M. (2021). The impact of glaciers on mountain erosion. Nature Reviews Earth & Environment
- Jouvet, G., & Cordonnier, G. (2023). Ice-flow model emulator based on physics-informed deep learning. Journal of Glaciology, 1–15
- Lai, J., & Anders, A. M. (2021). Climatic controls on mountain glacier basal thermal regimes dictate spatial patterns of glacial erosion. Earth Surface Dynamics, 9(4), 845–859
- Willett, C. D., Ma, K. F., Brandon, M. T., Hourigan, J. K., Christeleit, E. C., & Shuster, D. L. (2020). Transient glacial incision in the Patagonian Andes from ~6 Ma to present. Science Advances, 6(7), eaay1641
Scalable approaches to mathematical modelling and uncertainty quantification in heterogeneous peatlands
Project institution: University of Glasgow
Project supervisor(s): Dr Raimondo Penta (University of Glasgow), Dr Vinny Davies (University of Glasgow), Prof Jessica Davies (Lancaster University), Dr Lawrence Bull (University of Glasgow) and Dr Matteo Icardi (University of Nottingham)
Overview and Background
While only covering 3% of the Earth’s surface, peatlands store >30% of terrestrial carbon and play a vital ecological role. Peatlands are, however, highly sensitive to climate change and human pressures, and therefore understanding and restoring them is crucial for climate action. Multiscale mathematical models can represent the complex microstructures and interactions that control peatland dynamics but are limited by their computational demands. GPU and exascale computing advances offer a timely opportunity to unlock the potential benefits of mathematically-led peatland modelling approaches. By scaling these complex models to run on new architectures, or by directly incorporating mathematical constraints into GPU-based deep learning approaches, scalable computing will deliver transformative insights into peatland dynamics and their restoration, supporting global climate efforts.
Teaser Project 1: Scalable Mathematical Modelling of Peatlands
Objectives: This project will explore how we can perform scalable modelling and inference on mathematical models of peatlands. The project will take existing microscale models for peatlands and investigate how mathematical optimisation can be used to learn the complex parameters of the mathematical model. The focus will then be on how the model can be upscaled and improved, concentrating on computational inference methods that will remain applicable as the model is expanded and becomes more computationally demanding.
Methods: The project will use scalable mathematical processing and optimisation techniques, examining how they compare with computational statistical inference methods such as Bayesian optimisation. The peatland model will be used for simulations and analysed to understand how it can be improved to capture complex non-linear processes. Future work as part of a potential PhD project would involve extending the model and adapting it to run in high-performance computing environments, and extending the optimisation techniques to work in this setting.
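The calibration step described above can be sketched on a deliberately simple stand-in: below, a one-parameter decay model of peat decomposition is fitted to synthetic observations by minimising a squared misfit. The model, data and decay-rate parameter `k` are hypothetical illustrations, not the project's actual peatland model, and the grid search would be replaced by gradient-based or Bayesian optimisation over many parameters.

```python
# Hypothetical toy example: calibrate the decay rate k of a simple
# peat-decomposition model dC/dt = -k*C against noisy observations.
import numpy as np

TIMES = np.linspace(0.0, 10.0, 21)

def simulate(k, c0=100.0):
    """Forward model: analytic solution of dC/dt = -k*C."""
    return c0 * np.exp(-k * TIMES)

rng = np.random.default_rng(0)
k_true = 0.3
observations = simulate(k_true) + rng.normal(0.0, 1.0, TIMES.size)

def misfit(k):
    """Sum of squared differences between model and observations."""
    return float(np.sum((simulate(k) - observations) ** 2))

# Simple grid search over the parameter; recovers a value near k_true.
k_grid = np.linspace(0.01, 1.0, 1000)
k_best = k_grid[np.argmin([misfit(k) for k in k_grid])]
```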
PhD Project: The main purpose of the PhD project will be scaling up the peatland models: adding more features and scaling them to run across computer clusters and on GPUs, with the eventual aim of extending this to Exascale computing. Advanced mathematical techniques will be used to upscale from micro- to macroscale models, incorporating nonlinear instabilities such as wrinkling and surface patterning. Computational methods will then be extended to focus on predicting long-term peatland behaviour under restoration scenarios and climatic stressors. The integration of experimental data for validation and refinement of the models will ensure practical applicability.
Teaser Project 2: Mathematically Informed Machine Learning for Scalable Peatland Modelling
Objectives: This project will explore how emulation techniques can be used for scalable parameter inference and uncertainty quantification in an existing model for peatlands. We will use parallelised computing to run multiple simulations of the peatland model and then use GPU-based deep learning methods to build an emulator. The emulator will provide a computationally cheaper version of the original model, allowing us to use Bayesian inference in previously infeasible scenarios and giving the ability to estimate the model parameters and their associated statistical uncertainty.
Methods: The project will make use of deep learning architectures that are specifically designed to scale to GPUs and, eventually, Exascale-type infrastructures. Specifically, the emulation methods will use and compare deep neural networks and deep Gaussian processes to link model parameters to observed model outcomes. Optimisation and Bayesian inference will then be carried out using the emulator within the context of an inverse problem. Future work as part of a potential PhD project would involve extending these methods into more complex deep learning frameworks, e.g. physics-informed machine learning or graph neural networks.
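The emulator-in-the-inverse-problem idea can be illustrated in miniature: an expensive simulator is evaluated at a small design of parameter values, a cheap surrogate stands in for it, and a posterior over the parameter is then evaluated on a grid. Everything here is a hypothetical stand-in; the project's emulator would be a deep neural network or deep Gaussian process rather than the 1D interpolator used below.

```python
# Illustrative sketch of emulator-based Bayesian inference with a
# made-up one-parameter "simulator" and a flat prior on a grid.
import numpy as np

def expensive_simulator(k):
    """Stand-in for a costly peatland model run: one scalar summary output."""
    return 100.0 * np.exp(-k * 5.0)

# 1) Run the simulator at a small design of parameter values...
k_design = np.linspace(0.05, 1.0, 20)
y_design = np.array([expensive_simulator(k) for k in k_design])

# 2) ...and build a cheap emulator (here, simple 1D interpolation).
def emulate(k):
    return np.interp(k, k_design, y_design)

# 3) Bayesian inference, with the emulator replacing the simulator.
y_obs, sigma = 25.0, 2.0                 # observation and its noise level
k_grid = np.linspace(0.05, 1.0, 500)
log_like = -0.5 * ((emulate(k_grid) - y_obs) / sigma) ** 2
posterior = np.exp(log_like - log_like.max())   # flat prior on the grid
posterior /= posterior.sum()
k_mean = float(np.sum(k_grid * posterior))       # posterior mean of k
```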
PhD Project: A potential follow-on PhD project would focus on incorporating the mathematical models directly into the deep learning structures and linking the model to real data. This could be achieved by making the models more scalable, replacing the mathematical finite element methods with GPU-trained deep learning alternatives or with methods from the physics-informed machine learning literature. Linking the model to real data will also be computationally challenging, building either directly on the emulation methods from the initial project or on the mathematically informed machine learning methods that have been developed. Essentially, this project will aim to link the model to real-world applications that can help us gain a better understanding of the structure of peatlands.
-
Scalable Inference and Uncertainty Quantification for Ecosystem Modelling
Project institution: University of Glasgow
Project supervisor(s): Dr Vinny Davies (University of Glasgow), Prof Richard Reeve (University of Glasgow), Prof David Johnson (Lancaster University), Prof Christina Cobbold (University of Glasgow) and Dr Neil Brummitt (Natural History Museum)
Overview and Background
Understanding the stability of ecosystems and how they are impacted by climate and land use change can allow us to identify sites where biodiversity loss will occur and help to direct policymakers in mitigation efforts. Our current digital twin of plant biodiversity provides functionality for simulating species through processes of competition, reproduction, dispersal and death, as well as environmental changes in climate and habitat, but it would benefit from enhancement in several areas. The three areas this project would most likely target are: the introduction of a soil layer (and the improvement of the modelling of soil water); improving the efficiency of the code to handle a more complex model and to allow stochastic and systematic Uncertainty Quantification (UQ); and developing techniques for scalable inference of missing parameters.
Teaser Project 1: Computational: Port core EcoSISTEM code to GPU
This project, led by Davies, will analyse the core CPU routines in EcoSISTEM and port them to GPU. This will use packages from the JuliaGPU ecosystem, in particular CUDA.jl, a Julia package that provides a relatively easy user interface to the NVIDIA A100 GPUs available on UG’s MARS HPC system, to which the student will have access. The main branch of the EcoSISTEM code is already efficiently parallelised for CPUs, and a preliminary assessment has suggested that the porting task should be feasible within a teaser project. This work will require support from Reeve, both for his understanding of EcoSISTEM and for his general Julia and HPC experience. This teaser project can be extended in a variety of ways to a full PhD:
On the one hand, once the GPU port speed-ups have been realised, the student can add two major new components to EcoSISTEM. First, Uncertainty Quantification of both the variability across stochastic realisations and the systematic variability from parametric uncertainty can be feasibly added to the code, to allow us to better understand the uncertainty in possible outcomes. Second, the student can investigate scalable inference techniques for parameter inference within EcoSISTEM. These are both areas in which Davies has extensive experience.
On the other hand, there is a more sophisticated development (dev) branch of EcoSISTEM that is not currently well optimised but allows greater flexibility in how interactions can occur between components of the model. Porting this to GPUs will be a significantly harder task, but will allow richer interactions to be more easily modelled between ecosystem components. Incorporating additional ecological components into the GPU port of EcoSISTEM, such as those described in the ecological teaser project below, would also be possible. These are areas where Johnson, Cobbold and Brummitt’s expertise will be critical.
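The stochastic-realisation side of the UQ extension can be sketched in a few lines: run an ensemble of stochastic simulations and summarise the spread of outcomes. The logistic population model below is a hypothetical stand-in for one grid cell of EcoSISTEM's much richer dynamics (and the project itself works in Julia, not Python); the vectorisation across realisations is the same data-parallel pattern that maps naturally onto a GPU port.

```python
# Illustrative ensemble UQ: 1000 stochastic realisations of a noisy
# logistic population model, advanced in lockstep, then summarised.
import numpy as np

rng = np.random.default_rng(42)
n_realisations, n_steps = 1000, 50
r, carrying_capacity = 0.2, 500.0

pop = np.full(n_realisations, 50.0)
for _ in range(n_steps):
    growth = r * pop * (1.0 - pop / carrying_capacity)
    noise = rng.normal(0.0, 5.0, n_realisations)
    pop = np.maximum(pop + growth + noise, 0.0)   # populations stay non-negative

# Summaries across realisations: mean outcome and a 95% interval.
mean_outcome = pop.mean()
lo, hi = np.percentile(pop, [2.5, 97.5])
```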
Teaser Project 2: Ecological: Incorporate key soil properties and relevant plant traits into EcoSISTEM
This project, led by Johnson, will add a preliminary model of the interaction between plant species, the species’ soil-specific traits, including fungal and microbial associations, and key soil properties, including not just broad chemical, physical and biological measures but also their microbiomes, based on existing work by Johnson, with help from Cobbold (for developing the soil-plant ecological model), Brummitt (for the botanical expertise) and Reeve (for integration into EcoSISTEM). This aspect of the project can be extended in many ways. This can involve enhancing the plant-soil modelling by adding in more aspects of the interaction, or by improving the soil-water modelling to allow consideration of infiltration and soil water release characteristics, which are more important than the existing spot-measure of moisture content. It can also involve porting aspects of the model to GPU, and adding in other aspects (UQ, inference) referred to in the computational teaser project above. All of these aspects of the project will require Davies’s expertise.
References and Further Reading
- Digital twins of the natural environment (click here)
- Dynamic virtual ecosystems as a tool for detecting large-scale responses of biodiversity to environmental and land-use change (click here)
- Effective extensible programming: Unleashing Julia on GPUs (click here)
- Strong phylogenetic signals in global plant bioclimatic envelopes (click here)
- Land management shapes drought responses of dominant soil microbial taxa across grasslands (click here)
Projects with a focus on Geodynamics, Geosciences and Environmental Change:
-
Investigate the response of proglacial fluvial systems to glacier retreat using GPU-accelerated numerical simulations
Project institution: University of Glasgow
Project supervisor(s): Dr Amanda Owen (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Richard Williams (University of Glasgow) and Prof Todd Ehlers (University of Glasgow)
Overview and Background
Recent climate change has driven glacier retreat worldwide, releasing increasing volumes of sediments and meltwater into proglacial environments (Zhang et al., 2022). Combined with an increase in extreme weather events, this has triggered rapid geomorphic changes in proglacial fluvial systems (Heckmann et al. 2016). These changes pose significant risks to downstream areas, threatening infrastructure, food security, and ecological stability. Despite their importance, our understanding of 1) how proglacial rivers respond to increased meltwater and sediment influx from retreating glaciers and 2) how the glacier-river system collectively responds to long-term climate trends and short-term weather extremes remains limited.
This PhD project seeks to address these knowledge gaps through advanced GPU-based numerical simulations. By developing and coupling models for sediment dynamics and glacier evolution, the research will explore the interplay between glaciers and proglacial rivers under varying climatic and environmental conditions. The student will focus on developing and validating the numerical model, designing and conducting simulations to understand key interactions and mechanisms, and applying the model in selected field locations to provide predictions.
Methodology and Objectives
This project aims to use GPU-based numerical simulations to investigate the responses of proglacial fluvial systems to both long-term climate changes and short-term weather variations. The work will involve developing new, efficient code for simulating sediment dynamics in rivers on GPU devices and/or coupling existing sediment dynamics models with a GPU-based glacial landscape evolution model. The simulations will be validated against field observations and integrated with environmental datasets to provide robust predictions of how proglacial river systems may evolve under future climate scenarios.
Teaser Project 1: Investigate sediment dynamics and geomorphic changes in proglacial fluvial system
As glaciers continue to retreat due to climate change, their downstream river systems experience dynamic and complex adjustments in sediment transport, channel morphology, and erosional/depositional patterns. The first teaser project aims to simulate and understand the rapid geomorphic changes in proglacial fluvial systems driven by variations in upstream meltwater and sediment fluxes.
The student will build upon existing sediment dynamics models such as SPACE (Shobe et al., 2017) and Eros (Davy et al., 2017) to create a GPU-based high-resolution 2D model tailored for proglacial rivers. This model will explicitly simulate sediment entrainment, transport, and deposition processes, allowing for detailed exploration of fluvial responses to varying upstream inputs. Advanced computational techniques, including parallel processing, will be employed to ensure efficiency and scalability.
Using this model the student will design and conduct scenario-based simulations to explore geomorphic changes in proglacial rivers under different conditions of meltwater and sediment input. A sensitivity analysis will then be performed to identify key parameters, such as slope gradients, channel geometry, and sediment grain size distribution, that influence sediment dynamics and channel evolution. Finally, the model will be calibrated and validated against field or experimental data to ensure accuracy and robustness, providing a foundation for broader application to various proglacial systems.
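The kind of grid-based sediment solver described above can be sketched at its most minimal: the toy below evolves a 1D topographic profile by linear diffusion, where erosion and deposition emerge from the divergence of a downslope sediment flux. Grid size, diffusivity and timestep are illustrative; the teaser project's model (building on SPACE or Eros) would be 2D, process-richer and GPU-parallel, but the stencil-update structure that makes it parallelisable is the same.

```python
# Minimal 1D sediment-transport sketch: explicit finite-difference
# diffusion of a topographic profile with zero-flux boundaries.
import numpy as np

nx, dx = 101, 10.0                     # grid cells and spacing (m)
kappa = 0.5                            # sediment diffusivity (m^2/yr)
dt = 0.4 * dx * dx / (2.0 * kappa)     # timestep within the explicit stability limit

x = np.arange(nx) * dx
z = np.where(np.abs(x - 500.0) < 100.0, 20.0, 0.0)  # initial 20 m plateau
initial_volume = z.sum() * dx

for _ in range(2000):
    flux = -kappa * np.diff(z) / dx                   # downslope sediment flux at interfaces
    dzdt = -np.diff(flux, prepend=0.0, append=0.0) / dx
    z += dzdt * dt                                    # erosion where dzdt<0, deposition where dzdt>0

# Zero-flux boundaries mean total sediment volume is conserved.
final_volume = z.sum() * dx
```

The conservation check at the end is the kind of sanity test that precedes calibration against field or experimental data.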
Teaser Project 2: Investigate the coupled evolution of glacier-river system driven by climate change and weather extremes
Recent climate change and increased weather extremes significantly impact both glacier dynamics and proglacial fluvial systems. By coupling glacier and sediment dynamics models, the second teaser project seeks to understand the coevolution of retreating glaciers and proglacial river systems under these influences, providing insights into their interconnected responses to climatic and extreme weather events.
The student will couple a GPU-accelerated glacier model (IGM; Jouvet and Cordonnier 2023) with a fluvial sediment dynamics model, such as SPACE or Eros. This integrated model will be able to simulate the response of the coupled glacier-river system to climate and weather variations.
The student will design and conduct a group of simulations covering a range of climate scenarios and investigate how climate-driven glacier retreat impacts proglacial river processes. The student will analyse the simulations to explore various scenarios of meltwater and sediment flux release under different climatic conditions, identifying distinct geomorphic responses of proglacial rivers and determining whether glacier retreat results in sedimentation, erosion, or a combination of both. Finally, the model outputs will be validated against field observations or experimental data, and sensitivity analyses will be conducted to identify the primary controls on river responses, providing robust insights into the coevolution of glaciers and proglacial river systems.
References and Further Reading
- Davy, P., Croissant, T., & Lague, D. (2017). A precipiton method to calculate river hydrodynamics, with applications to flood prediction, landscape evolution models, and braiding instabilities. Journal of Geophysical Research: Earth Surface, 122(8), 1491–1512 (click here)
- Heckmann, T., McColl, S., & Morche, D. (2016). Retreating ice: research in pro-glacial areas matters. Earth Surface Processes and Landforms, 41(2), 271–276 (click here)
- Jouvet, G., & Cordonnier, G. (2023). Ice-flow model emulator based on physics-informed deep learning. Journal of Glaciology, 1–15 (click here)
- Shobe, C. M., Tucker, G. E., & Barnhart, K. R. (2017). The SPACE 1.0 model: a Landlab component for 2-D calculation of sediment transport, bedrock erosion, and landscape evolution. Geoscientific Model Development, 10(12), 4577–4604 (click here)
- Zhang, T., Li, D., East, A. E., Walling, D. E., Lane, S., Overeem, I., et al. (2022). Warming-driven erosion and sediment transport in cold regions. Nature Reviews Earth & Environment, 1–20 (click here)
-
Statistical Emulation Development for Landscape Evolution Models
Project institution: University of Glasgow
Project supervisor(s): Dr Benn Macdonald (University of Glasgow), Dr Mu Niu (University of Glasgow), Dr Paul Eizenhöfer (University of Glasgow), and Dr Eky Febrianto (University of Glasgow)
Overview and Background
Many real-world processes, including those governing landscape evolution, can be effectively described mathematically via differential equations. These equations describe how processes, e.g. the physiography of mountainous landscapes, change with respect to other variables, e.g. time and space. Conventional approaches for performing statistical inference involve repeated numerical solving of the equations. Every time parameters of the equations are changed in a statistical optimisation or sampling procedure, the equations need to be re-solved numerically. The associated large computational cost limits advancements when scaling to more complex systems, the application of statistical inference and machine learning approaches, as well as the implementation of more holistic approaches to Earth System science. This creates the need for an accelerated computing paradigm involving highly parallelised GPUs for the evaluation of the forward problem.
Beyond advanced computing hardware, emulation is becoming a more popular way to tackle this issue. The idea is that first the differential equations are solved as many times as possible and then the output is interpolated using statistical techniques. Then, when inference is carried out, the emulator predictions replace the differential equation solutions. Since prediction from an emulator is very fast, this avoids the computational bottleneck. If the emulator is a good representation of the differential equation output, then parameter inference can be accurate.
Methodology and Objectives
Methods Used: Gaussian process interpolation (for building the emulator), Bayesian inference (for parameter inference), geomorphological analyses, surface processes modelling.
Teaser Project 1: GPU-accelerated differential equation solver
Geodynamic models in Earth Science are used to simulate a range of natural processes. Landscape evolution models specifically contain, amongst others, equations that describe surface processes such as erosion and sediment deposition, as well as rock/surface uplift and aspects of climate change. However, the numerical solver executes consecutively, rather than generating solutions in parallel. This first teaser project will commence at the beginning of the PhD project (semester 1) and will focus on familiarising the student with parallel computing via GPUs, including the optimisation of existing landscape evolution models for GPU use. At the same time, the student will take training from ExaGEO, equivalent to 20 UoG credits, in GPU programming and Exascale principles. This teaser project will support the PhD project in developing robust, reliable and efficient emulators for landscape evolution models, utilising GPU power, which will allow for a denser training set and the inclusion of a broader variety of geomorphological scenarios. This teaser project will also give insight into possible GPU acceleration in the emulation process itself.
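The consecutive-versus-parallel contrast can be made concrete with a batched forward solve: instead of looping over parameter values and re-solving the equation for each one, a vectorised solver advances all parameter sets in lockstep, which is the same pattern a GPU executes across thousands of threads. The toy equation dz/dt = U - K·z (uplift minus erosion of a mean elevation) is an illustrative stand-in for a real landscape evolution model.

```python
# Batched Euler integration of dz/dt = U - K*z for 10,000 erodibility
# values at once -- one time loop, all parameter sets in parallel.
import numpy as np

uplift = 1.0e-3                       # uplift rate (m/yr)
K = np.linspace(1e-5, 1e-4, 10000)    # 10,000 erodibility values, solved together
dt, n_steps = 100.0, 5000

z = np.zeros_like(K)
for _ in range(n_steps):
    z += (uplift - K * z) * dt        # vectorised update across the whole batch

# Each entry approaches its analytic steady state z* = U/K.
steady_state = uplift / K
```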
Teaser Project 2: Emulator development
The second teaser project will look at creating an emulator for a simple mathematical model describing elevation change as a function of spatial and temporal variations in surface uplift and efficiency of erosion. This will take place in semester 2, during which the student will also undergo training from ExaGEO in statistical and numerical methods in computing, complementing the student’s research aims at this stage. The skills the student develops during this teaser project, combined with those attained from teaser project 1, will set them up well to develop efficient emulators for more complex landscape evolution models as the PhD project evolves.
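The Gaussian-process interpolation named in the methods can be sketched in miniature: a stand-in "solver" is evaluated at a handful of training parameter values, and a GP posterior mean then predicts the solver output at new values without re-solving. The solver, kernel choice and hyperparameters below are illustrative assumptions, not the project's actual model.

```python
# Minimal Gaussian-process emulator: fit to a few expensive "solver"
# runs, then predict cheaply at unseen parameter values.
import numpy as np

def solver(uplift_rate):
    """Stand-in for an expensive landscape-evolution solve (scalar summary)."""
    return np.sin(3.0 * uplift_rate) + uplift_rate

def rbf(a, b, lengthscale=0.3):
    """Squared-exponential kernel between two 1D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

# Training design: nine expensive solver runs.
x_train = np.linspace(0.0, 2.0, 9)
y_train = solver(x_train)

# GP posterior mean weights (zero prior mean, small jitter for stability).
K = rbf(x_train, x_train) + 1e-8 * np.eye(x_train.size)
alpha = np.linalg.solve(K, y_train)

def emulate(x_new):
    return rbf(x_new, x_train) @ alpha

# Emulator predictions track the solver closely between training points.
x_test = np.linspace(0.0, 2.0, 101)
max_error = np.max(np.abs(emulate(x_test) - solver(x_test)))
```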
The student will be well supported by the supervisory team. Dr Eizenhöfer has expertise in landscape evolution modelling and reconstruction, Dr Macdonald and Dr Niu have expertise in developing statistical methodology in the area of statistical emulation and Dr Febrianto has expertise in highly parallelised architecture for scientific computing and will be able to advise on software development and design with open-source vision, as well as aspects of the GPU software development.
References and Further Reading
- Rasmussen, C.E., & Williams, C.K.I. (2006). Gaussian Processes for Machine Learning. The MIT Press. ISBN 0-262-18253-X
- Donnelly, J., Abolfathi, S., Pearson, J., Chatrabgoun, O., & Daneshkhah, A. (2022). Gaussian process emulation of spatio-temporal outputs of a 2D inland flood model. Water Research. Volume 225. ISSN 0043-1354
- Clark, M. K., Royden, L. H., Whipple, K. X., Burchfiel, B. C., Zhang, X., & Tang, W. (2006). Use of a regional, relict landscape to measure vertical deformation of the eastern Tibetan Plateau. Journal of Geophysical Research: Earth Surface, 111(F3)
- Eizenhöfer, P. R., McQuarrie, N., Shelef, E., & Ehlers, T. A. (2019). Landscape response to lateral advection in convergent orogens over geologic time scales. Journal of Geophysical Research: Earth Surface, 124(8), 2056-2078
- Mutz, S. G., & Ehlers, T. A. (2019). Detection and explanation of spatiotemporal patterns in Late Cenozoic palaeoclimate change relevant to Earth surface processes. Earth Surface Dynamics, 7(3), 663-679
- Whipple, K. X., Forte, A. M., DiBiase, R. A., Gasparini, N. M., & Ouimet, W. B. (2017). Timescales of landscape response to divide migration and drainage capture: Implications for the role of divide mobility in landscape evolution. Journal of Geophysical Research: Earth Surface, 122(1), 248-273
- Whittaker, A. C., & Boulton, S. J. (2012). Tectonic and climatic controls on knickpoint retreat rates and landscape response times. Journal of Geophysical Research: Earth Surface, 117(F2)
- Yang, R., Willett, S. D., & Goren, L. (2015). In situ low-relief landscape formation as a result of river network disruption. Nature, 520(7548), 526-529
- Zachos, J. C., Dickens, G. R., & Zeebe, R. E. (2008). An early Cenozoic perspective on greenhouse warming and carbon-cycle dynamics. Nature, 451(7176), 279-283
-
Towards exa-scale simulations of slabs, core-mantle heterogeneities and the geodynamo
Project institution: University of Glasgow
Project supervisor(s): Prof Radostin Simitev (University of Glasgow), Dr Antoniette Greta Grima (University of Glasgow) and Dr Kevin Stratford (University of Edinburgh)
Overview and Background
Scientific computing is crucial for understanding geophysical fluid flows, such as the geodynamo that sustains Earth’s magnetic field. This project will adapt an existing pseudo-spectral geodynamo code for magnetohydrodynamic simulations in rotating spherical geometries to GPU architectures, improving efficiency on modern computing systems and enabling simulations of more realistic regimes. This will advance our understanding of Earth’s geomagnetic field and its broader interactions, such as those with mantle heterogeneities. Evidence from seismology and geodynamics shows that the core-mantle boundary (CMB) is highly heterogeneous, influencing heat transport and geodynamo dynamics. By combining compressible, thermochemical convection with geodynamo simulations, this project will further investigate how deep slab properties affect the CMB heat flux, mantle heterogeneity, and the geodynamo.
Teaser Project 1: What is the impact of ancient slabs on core-mantle boundary heterogeneities and the geodynamo?
Evidence from seismology and geodynamics reveals that the lowermost mantle and the core-mantle boundary (CMB) are highly heterogeneous due to the presence of post-perovskite, large low shear wave velocity provinces and ancient, subducted slab material. CMB heterogeneity results in variable heat transport from the core and plays a key role in core and mantle dynamics, the geodynamo, and ultimately the Earth’s habitability. Previous work shows that the spatiotemporal evolution of CMB heterogeneity is closely linked to deep slab dynamics (e.g., Heron et al., 2024, 2025); however, these links remain poorly understood. This teaser project will investigate the role of deep slab properties in the temporal evolution of deep mantle heterogeneity, the CMB heat flux and the geodynamo. This will involve modelling compressible, multiphase, thermochemical convection in a 3D spherical shell following the approach of Dannberg et al. (2024) and Heron et al. (2024, 2025), using the state-of-the-art, open-source, adaptive-mesh-refinement finite element software ASPECT (Heister et al., 2017). These models will include the subduction history over the last 1 billion years from Merdith et al. (2021) and will be supported by high-resolution 3D regional models investigating the role of end-member slab properties (e.g., weak vs. strong slabs) on CMB heterogeneity. Temporal variations in CMB heat flux from these models will then be analysed using spherical harmonics across the first 4 harmonic degrees, similar to the approach of Dannberg et al. (2024), and used as a thermal boundary condition for the geodynamo simulations. The goal is to expand teaser project 1 to investigate the influence of the deep slab on core-mantle dynamics and the implications this has for magnetic field generation and the strength and frequency of polarity reversals.
Objectives:
- Use global convection models to calculate the temporal evolution of heat flux at the CMB.
- Investigate the influence of end member slab rheologies and geometries on the heat flux heterogeneity at the CMB.
- Apply the calculated heat flux across the CMB from geodynamic models as a boundary condition to geodynamo simulations to investigate heterogeneity in magnetic field strength and the timing and frequency of magnetic field reversals.
- Use GPU architecture to couple finite element mantle convection with geodynamo simulations.
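The spherical-harmonic analysis step in the objectives can be illustrated on a simplified case: projecting an axisymmetric heat-flux pattern q(θ) onto Legendre polynomials P_l(cos θ) for degrees l = 0..4. The synthetic heat-flux pattern is a hypothetical stand-in; real model output would be a full 2D field analysed with non-zero-order (m ≠ 0) terms as well.

```python
# Project an axisymmetric field onto Legendre polynomials (the m = 0
# spherical harmonics) using Gauss-Legendre quadrature in mu = cos(theta).
import numpy as np

mu, w = np.polynomial.legendre.leggauss(32)   # quadrature nodes and weights

def legendre(l, x):
    """P_l(x) via a Legendre series with a single unit coefficient."""
    return np.polynomial.legendre.legval(x, [0.0] * l + [1.0])

# Synthetic heat-flux pattern: a constant plus a degree-2 anomaly.
q = 10.0 + 3.0 * legendre(2, mu)

# c_l = (2l+1)/2 * integral of q * P_l over mu in [-1, 1]
coeffs = {l: (2 * l + 1) / 2.0 * np.sum(w * q * legendre(l, mu))
          for l in range(5)}
```

The orthogonality of the P_l means the projection cleanly recovers the constant at degree 0 and the anomaly amplitude at degree 2, with the remaining degrees near zero.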
Teaser Project 2: Spectral expansion transforms in spherical geometry
Modelling the geodynamo involves solving the coupled 3D, time-dependent, nonlinear Navier-Stokes equations, pre-Maxwell electrodynamics, and heat transfer equations for a rotating fluid. At present, the pseudo-spectral method is the most accurate and widely used numerical discretisation method in this context. The method requires applying physical to spectral space transforms which are generally in integral form and have been difficult to adapt to GPU architectures. With GPUs becoming increasingly powerful and accessible, this sub-project aims to port an existing versatile pseudo-spectral code for magnetohydrodynamic simulations in rotating spherical geometries to GPU systems.
Objectives:
- Investigate alternative orthogonal polynomial basis function families that can be used to expand fields in spherical geometry, including Legendre, Jones-Worland, Jacobi and Galerkin.
- Implement these alternatives in the code and assess/compare the convergence, stability and consistency of the resulting discretisations, as well as their efficiency for GPU acceleration.
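The convergence assessment in the second objective can be demonstrated numerically on a toy case: expanding a smooth function in a Legendre basis and watching the truncation error fall spectrally (faster than any power of the basis size). The test function is arbitrary; the project would run such comparisons for the actual field expansions (Legendre, Jones-Worland, Jacobi, ...).

```python
# Spectral convergence of a truncated Legendre expansion of exp(x).
import numpy as np

mu, w = np.polynomial.legendre.leggauss(64)   # quadrature nodes and weights
f = np.exp(mu)                                # smooth test function on [-1, 1]

def truncation_error(n_modes):
    """Max reconstruction error using the first n_modes Legendre polynomials."""
    coeffs = np.zeros(n_modes)
    for l in range(n_modes):
        P_l = np.polynomial.legendre.legval(mu, [0.0] * l + [1.0])
        coeffs[l] = (2 * l + 1) / 2.0 * np.sum(w * f * P_l)
    return np.max(np.abs(np.polynomial.legendre.legval(mu, coeffs) - f))

# Doubling the basis size drives the error down by orders of magnitude.
errors = [truncation_error(n) for n in (2, 4, 8, 16)]
```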
References and Further Reading
- Dannberg, J., Gassmoeller, R., Thallner, D., LaCombe, F., & Sprain, C. (2023). Changes in core-mantle boundary heat flux patterns throughout the supercontinent cycle. arXiv preprint arXiv:2310.03229
- Roberts, P.H., & King, E.M. (2013). On the genesis of the Earth’s magnetism. Reports on Progress in Physics, 76, 096801 (click here)
- Glatzmaier, G.A. (2014). Introduction to Modeling Convection in Planets and Stars: Magnetic Field, Density Stratification, Rotation. Princeton University Press (click here)
- Heister, T., Dannberg, J., Gassmöller, R., & Bangerth, W. (2017). High accuracy mantle convection simulation through modern numerical methods – II: Realistic models and problems. Geophysical Journal International, 210(2), 833–851 (click here)
- Heron, P.J., Dannberg, J., Gassmöller, R., Shephard, G.E., & Pysklywec, R. N. (2025). The impact of Pangaean subducted oceans on mantle dynamics: passive piles and the positioning of deep mantle plumes. Gondwana Research
- Heron, P.J., Gün, E., Shephard, G.E., Dannberg, J., Gassmöller, R., Martin, E., Sharif, A., Pysklywec, R. N., Nance, R.D., & Murphy, J.B. (2025). The role of subduction in the formation of Pangaean oceanic large igneous provinces. Geological Society London, Special Publications, 542(1)
- Merdith, A.S., Williams, S.E., Brune, S., Collins, A.S., & Müller, D.R. (2021). Extending full-plate tectonic models into deep time: linking the Neoproterozoic and the Phanerozoic. Earth-Science Reviews, 214 (click here)
- Silva, L., & Simitev, R. (2018). Pseudo-spectral code for numerical simulation of nonlinear thermo-compositional convection and dynamos in rotating spherical shells. Zenodo, 1311203 (click here)
Projects with a focus on Geologic Hazard Analysis, Prediction and Digital Twinning:
-
Developing large-scale hydrodynamic flood forecasting models for exascale GPU systems
Project institution: University of Edinburgh
Project supervisor(s): Dr Mark Bull (University of Edinburgh), Dr Maggie Creed (University of Glasgow), Prof Simon Mudd (University of Edinburgh) and Dr Declan Valters (British Geological Survey)
Overview and Background
Flood forecasting at regional and national scale is imperative for predicting the scale and distribution of floodwaters during extreme weather events, mitigating the impact on communities most at risk from flooding. The LISFLOOD family of surface water models has proved suitable for parallelisation at scale, allowing research and forecasting communities to take advantage of the previous generation of supercomputers, such as ARCHER.
The increasing availability of high resolution topographic and meteorological data provides an opportunity to extend the capability of the LISFLOOD modelling framework to produce large-scale or high resolution flood forecasts at operational timescales – i.e. producing model runs at sufficient lead-in times to alert communities to impending flood risk from forecasted extreme weather events. GPU-based exascale HPC systems provide the technological basis to develop forecast models delivering at operational timescales.
Methodology and Objectives
LISFLOOD is a family of hydrological models based on a 2D grid simulating rainfall-runoff. The water routing across a flood basin/river catchment is based on a simplified version of the shallow water (St Venant) equations. The model is process (physics) based, and there have been several implementations (see below), usually in C or C++, using a cellular automaton approach. These have been parallelised for CPU using OpenMP and in one spin-off project, MPI (see here for more details).
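The cellular-automaton routing idea behind the paragraph above can be sketched on a toy 1D slope: each step, rainfall is added and every cell passes water to its downslope neighbour in proportion to the water-surface head difference. This is a hypothetical illustration, not LISFLOOD's actual simplified shallow-water scheme, but the per-cell stencil update is the pattern that parallelises well on both CPU and GPU.

```python
# Toy cellular-automaton rainfall-runoff routing on a 1D slope with an
# outlet at the downstream end; mass is conserved exactly.
import numpy as np

n = 50
elevation = np.linspace(10.0, 0.0, n)       # uniform slope down to an outlet
depth = np.zeros(n)                          # water depth per cell
rain = 0.001                                 # rainfall per cell per step
outflow_total = 0.0

for _ in range(500):
    depth += rain
    head = elevation + depth                 # water-surface elevation
    # Flux to the right neighbour, proportional to the positive head
    # difference, limited by the water actually present in the cell.
    dh = np.maximum(head[:-1] - head[1:], 0.0)
    flux = np.minimum(0.5 * dh, depth[:-1])
    depth[:-1] -= flux
    depth[1:] += flux
    outflow_total += depth[-1]               # water leaving at the outlet
    depth[-1] = 0.0

total_rain_input = 500 * rain * n            # all water is accounted for
```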
The stencil-code library used in the previous CSE project, LibGeoDecomp, purports to have support for NVIDIA GPUs and CUDA (see here for more details).
Teaser Project 1:
- Implement the hydrodynamic core of the LISFLOOD model on GPU hardware to demonstrate proof-of-concept that the current CPU parallelised code is portable to GPU hardware.
- Methods for GPU parallelisation would include OpenMP offloading as an initial approach to verify the proof of concept. The project could then be extended to investigate the CUDA bindings available in the LibGeoDecomp library.
Teaser Project 2:
- Profile, then optimise GPU ported code and test using case studies of UK extreme flood events, to indicate potential for near-realtime flood forecasting of GPU-enabled LISFLOOD code.
- This objective would aim to deliver a minimum working example of the GPU-ported flood model, delivering forecasts/re-analysis of a historic flood event in the UK within an operational timescale.
Development into a full PhD would involve further profiling and optimisation of the GPU code using either the LibGeoDecomp library or another suitable GPU parallelisation framework. Delivering a proof-of-concept for a working flood forecast model at a regional scale would be a key aim of this project, demonstrating the potential to be used in operational flood forecasting systems. The full PhD may therefore look at workflow tools to integrate the various stages of forecast production, such as: ingestion and pre-processing of data (i.e. from rainfall forecast/nowcasting data products), model scheduling on HPC systems, and post-processing of the outputs.
References and Further Reading
- Coulthard, T.J., Neal, J.C., Bates, P.D., Ramirez, J., de Almeida, G.A. and Hancock, G.R., 2013. Integrating the LISFLOOD‐FP 2D hydrodynamic model with the CAESAR model: implications for modelling landscape evolution. Earth Surface Processes and Landforms, 38(15), pp.1897-1906
- LISFLOOD model high level overview (click here)
- Stencil Code for LibGeoDecomp (click here)
- Open Source version of the C++ code developed by Declan Valters (click here)
- Overview of an earlier project that developed an experimental version of the code for multi (CPU) node using stencil code (click here)
-
Earth system twin for landscape evolution processes
Project institution: University of Glasgow
Project supervisor(s): Prof Todd Ehlers (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow) and Dr Adam Smith (University of Glasgow)
Overview and Background
Global climate and environmental change increasingly result in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, precipitation events or drought, and temperatures (amongst other things). A team of University of Glasgow researchers are developing an Earth systems digital twin for exascale computing that works on GPU computers and uses weather forecasts to predict the cascading effect of climate change events on environmental systems. Our goal is to provide predictions at the national or larger scale for the impacts of environmental extremes on natural and urban settings. This project is one stand-alone component of this larger-scale project.
In this project you will develop and apply a landscape evolution model component of the Earth system model. We seek a student interested in surface water hydrology and landscape evolution modeling of rivers and hillslopes across Scotland. The student will develop software for investigating how weather forecasts and extreme weather events interact with geomorphic, hydrologic, and biosphere processes. Students from diverse backgrounds (e.g., geo- or hydrological sciences, engineering, maths, computer science) are welcome to apply to this project. The supervision team will take your background into account when setting the dissertation goals and provide mechanisms to learn the background information needed to fill in knowledge gaps.
Your job while working on this project will involve software development for simulating the relevant physical processes, applying the model to historic data for model evaluation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
Methods used in the first year of this project include the development of a GPU-based numerical model that calculates surface water budgets (runoff, infiltration, etc.) and applies the model to understand erosion, transport, and deposition of sediments as a function of fluvial and hillslope processes. The model will use meteorological forecasts, digital topography, vegetation cover, and soil/rock cover as inputs and will forecast river discharge and erosion/sedimentation. The final years of the project involve improving the model to incorporate different environmental data such as remote sensing data for land use, biota, and hydrology.
Teaser Project 1:
This teaser project, conducted in the first year, will focus on development of a GPU-based flow routing algorithm for application to Digital Elevation Model (DEM) data. The focus of the project is on understanding how precipitation that falls on a landscape during different weather events will influence the amount and rate of water moving over the landscape and the resulting river discharge. The calculation of the overland flow of water and river discharge is important for understanding (see Teaser Project 2) what types of rainfall events lead to the mobilization of sediment and river transport, or to saturation of hillslope regolith and mass wasting (landslide) events. This project will be done for large geographic regions and optimized for domain decomposition on a GPU cluster. Open-source (non-GPU) software exists for addressing this problem and can provide a template for development of a GPU-based version.
Initial efforts will focus on the identification of catchment boundaries and calculation of river runoff for different spatial and temporal distributions of precipitation. Time permitting, additional components of the hydrologic cycle will be added including infiltration rates as a function of different soil types, and evaporation, evapotranspiration, and snowpack melting processes. After implementing one or more of the previous configurations, the program will be applied to past meteorological events in Scotland and compared to observed river discharge.
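To give a flavour of the kind of algorithm involved, below is a minimal CPU/NumPy sketch of D8 flow accumulation, one of the standard flow routing schemes a GPU implementation might start from. The tilted-plane DEM and the drainage rule details are illustrative assumptions, not project specifications:

```python
import numpy as np

def d8_accumulation(dem):
    """Flow accumulation via the D8 rule: each cell sends all of its flow to
    its lowest (strictly lower) neighbour. Cells are processed from highest
    to lowest so every upstream contribution is known before being passed on."""
    ny, nx = dem.shape
    acc = np.ones_like(dem, dtype=float)        # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]    # flat indices, high to low
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        i, j = divmod(idx, nx)
        best, target = dem[i, j], None
        for di, dj in nbrs:
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx and dem[ni, nj] < best:
                best, target = dem[ni, nj], (ni, nj)
        if target is not None:                  # sinks keep their accumulation
            acc[target] += acc[i, j]
    return acc

# A plane tilted toward the bottom edge: all flow drains to the lowest row.
dem = np.add.outer(np.arange(5, 0, -1), np.zeros(4))
acc = d8_accumulation(dem)
```

A GPU version would replace the serial high-to-low sweep with a parallel scheme (e.g. iterative relaxation over the grid), which is part of what the project would explore.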
Teaser Project 2:
This teaser project, also conducted in the first year, focuses on development of GPU software to calculate how water flowing over landscapes (overland flow) and in rivers (discharge) entrains and erodes the underlying soil, sediment, or bedrock. This project is important because projected climate change will result in more intense rainfall events that could lead to increased erosion rates and higher sediment concentrations in rivers. For example, increased soil erosion removes nutrients needed by the biosphere and impacts agricultural practices. Too much soil erosion could therefore impact biodiversity and food security. At the start of this project you will work through learning tutorials from existing (non-GPU based) software to acquire an overview of the ‘big picture’ of processes you will address. Initial development efforts in this project will focus on calculating the shear stress of different amounts and velocities of water moving over a digital elevation model. These calculations will be used to determine, for different intensities of rainfall, how much sediment and rock is entrained in the flow and moved downslope. The goal would be the fast and efficient calculation of erosion rates across a landscape for different distributions of precipitation. Time permitting, the next steps of the project would include consideration of detachment/transport limiting conditions within the model and identification of where and when either erosion or deposition occur. Additional factors that can be taken into account are how different vegetation and soil types influence erosion, and including remote sensing data as model inputs for the selection of erosion-related model parameters.
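As an illustration of the first calculation described above, the sketch below combines the steady uniform-flow bed shear stress, tau = rho*g*h*S, with a generic excess-shear erosion law, E = k_e*(tau - tau_c)^a. The parameter values (k_e, tau_c, a) are placeholder assumptions for demonstration, not calibrated values:

```python
import numpy as np

RHO_W, G = 1000.0, 9.81    # water density (kg/m^3), gravity (m/s^2)

def excess_shear_erosion(depth, slope, k_e=1e-6, tau_c=5.0, a=1.5):
    """Bed shear stress for steady uniform flow, tau = rho*g*h*S, and an
    excess-shear erosion rate E = k_e*(tau - tau_c)^a, zero below tau_c."""
    tau = RHO_W * G * depth * slope
    excess = np.maximum(tau - tau_c, 0.0)
    return tau, k_e * excess ** a

depth = np.array([0.01, 0.1, 1.0])   # flow depths (m)
slope = 0.02                          # bed slope (m/m)
tau, erosion = excess_shear_erosion(depth, slope)
```

Applied per cell of a DEM with depths from the flow routing step, this is an embarrassingly parallel calculation and therefore a natural GPU kernel.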
An example movie of the different components of this project and how a landscape evolution model works is available here.
References and Further Reading
- Sharma, H. and Ehlers, T. A.: Effects of seasonal variations in vegetation and precipitation on catchment erosion rates along a climate and ecological gradient: insights from numerical modeling, Earth Surf. Dynam., 11, 1161–1181, 2023 (click here)
- Schmid, M., Ehlers, T. A., Werner, C., Hickler, T., and Fuentes-Espoz, J.-P.: Effect of changing vegetation and precipitation on denudation – Part 2: Predicted landscape response to transient climate and vegetation cover over millennial to million-year timescales, Earth Surface Dynamics, 6, 859–881, 2018 (click here)
- Hobley, D. E. J., Adams, J. M., Nudurupati, S. S., Hutton, E. W. H., Gasparini, N. M., Istanbulluoglu, E., and Tucker, G. E.: Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics, Earth Surface Dynamics, 5, 21–46, 2017 (click here)
Earth system twin for landslides along UK coasts with soil-rock mixtures
Project institution: University of Glasgow
Project supervisor(s): Dr Zhiwei Gao (University of Glasgow), Dr Jingtao Lai (University of Glasgow), Prof Lukasz Kaczmarczyk (University of Glasgow), Dr Martin Hurst (University of Glasgow) and Hassan Al-Budairi (QTS Group)
Overview and Background
Global climate and environmental change are increasingly resulting in weather extremes that impact society and infrastructure. These extremes include stormier climates with increased wind speeds, precipitation events or drought, and temperatures (amongst other things). A team of University of Glasgow researchers are developing an Earth systems digital twin for exascale computing that works on GPU computers and uses weather forecasts to predict the cascading effect of climate change events on environmental systems. Our goal is to provide predictions, at the national or large scale, of the impacts of environmental extremes on natural and urban settings. This project is one, stand-alone, component of this larger-scale project.
In this project, you will develop and apply one component of the Earth system model. We are seeking a student interested in GPU-accelerated large deformation modelling of landslides in soil-rock mixtures (SRM) along UK coasts. SRM is a naturally occurring material composed of high-strength rock fragments embedded in a matrix of low-strength soil. It is a common geological formation found in mountainous regions, river valleys and coasts. One example is the glacial till widely seen in the UK. SRM exhibits significant heterogeneity and anisotropy due to the random distribution and varying proportions of rock blocks and soil. In this project, we will develop a GPU-accelerated multifield plasticity simulation for modelling landslides in SRM.
Your job while working on this project will involve software development for simulating the relevant physical processes, applying the model to historic data for model evaluation, working in a team/workgroup environment, attending regular research group seminars, integrating diverse environmental and satellite data into your software, and learning new techniques through ExaGEO training workshops.
Methodology and Objectives
Methods used in this project involve multiscale modelling of SRM at the element level and large deformation modelling using multifield plasticity. The result from the multiscale modelling will be used to develop a constitutive model for SRM.
Teaser Project 1: Multiscale modelling of SRM
The mechanical behaviour of SRM is governed by the interaction between its components, with rock blocks contributing to structural stability and the soil matrix often controlling deformation and failure. Its unique characteristics, such as non-uniform strength, variable permeability, a wide range of particle sizes, and complex stress distribution, make SRM challenging to test and model. For instance, measuring the stress-strain relationship of SRM requires large equipment to accommodate the rock fragments in the testing cell. Developing such equipment is time-consuming and expensive. Therefore, we will use the multiscale approach to model the element response of SRM. At the mesoscale, elements of the SRM will be modelled to capture the detailed microstructure, including the distribution and properties of rock fragments and soil. This will be done using the finite element code MoFEM which is GPU compatible. The rock fragments will be modelled as non-deformable solids and soils will be modelled using a suitable elastoplastic model. The mechanical properties obtained from the microscale models will then be upscaled to the macroscale using homogenisation techniques. The multiscale modelling results will be validated and calibrated using experimental data to ensure accuracy. This involves comparing simulation results with laboratory tests reported in the literature. These simulations provide effective material properties that can be used in developing constitutive models for the SRM that are needed in large-scale modelling.
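As a toy illustration of upscaling mesoscale properties to the macroscale, the classical Voigt and Reuss bounds bracket the effective stiffness of a two-phase mixture from phase fractions and moduli alone. The moduli below are illustrative assumptions; the project's homogenisation of MoFEM results would be far richer than this first-order estimate:

```python
def voigt_reuss_bounds(phi_rock, e_rock, e_soil):
    """First-order homogenisation bounds for a two-phase mixture:
    Voigt assumes uniform strain (arithmetic mean of stiffnesses),
    Reuss assumes uniform stress (harmonic mean of stiffnesses)."""
    e_voigt = phi_rock * e_rock + (1.0 - phi_rock) * e_soil
    e_reuss = 1.0 / (phi_rock / e_rock + (1.0 - phi_rock) / e_soil)
    return e_voigt, e_reuss

# Illustrative moduli (MPa): stiff rock fragments in a soft soil matrix.
ev, er = voigt_reuss_bounds(phi_rock=0.3, e_rock=50000.0, e_soil=50.0)
```

The wide gap between the two bounds for such contrasting phases is exactly why detailed mesoscale simulation, rather than simple mixture rules, is needed to characterise SRM.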
Teaser Project 2: Large deformation modelling of landslides using multifield plasticity
The multifield plasticity developed at the Glasgow Computation Engineering Centre (GCEC) is a numerical method suitable for modelling large deformation problems, which eliminates the need for local integration of the elastoplastic model and can effectively exploit the computation power of GPUs. In the multifield framework, the balance of linear momentum, the flow rule, and the Karush–Kuhn–Tucker (KKT) constraints are formulated together within a variational framework. Beyond deformation, the plastic strain and the consistency parameter are treated as global degrees of freedom in the spatially discretised problem. To manage the increased number of global degrees of freedom, the method leverages the block sparse structure of the algebraic system and employs a customised block matrix solver designed to take advantage of modern hardware architectures. A constitutive model for the SRM will be implemented following the multifield plasticity and then used to model landslides in MoFEM. We will collaborate with research teams working on field observations and large-scale modelling in this development.
References and Further Reading
- Lewandowski, K., Barbera, D., Blackwell, P., Roohi, A. H., Athanasiadis, I., McBride, A., … & Kaczmarczyk, Ł. (2023). Multifield finite strain plasticity: Theory and numerics. Computer Methods in Applied Mechanics and Engineering, 414, 116101
- Gao, W. W., Gao, W., Hu, R. L., Xu, P. F., & Xia, J. G. (2018). Microtremor survey and stability analysis of a soil-rock mixture landslide: a case study in Baidian town, China. Landslides, 15, 1951-1961
- Gao, W., Yang, H., & Hu, R. (2022). Soil–rock mixture slope stability analysis by microtremor survey and discrete element method. Bulletin of Engineering Geology and the Environment, 81(3), 121
- Qiu, Z., Liu, Y., Tang, S., Meng, Q., Wang, J., Li, X., & Jiang, X. (2024). Effects of rock content and spatial distribution on the stability of soil rock mixture embankments. Scientific Reports, 14(1), 29088
- Li, J., Wang, B., Wang, D., Zhang, P., & Vardon, P. J. (2023). A coupled MPM-DEM method for modelling soil-rock mixtures. Computers and Geotechnics, 160, 105508
Exploring Hybrid Flood modelling leveraging GPU/Exascale computing
Project institution: University of Glasgow
Project supervisor(s): Dr Andrew Elliot (University of Glasgow), Prof Lindsay Beevers (University of Edinburgh), Prof Claire Miller (University of Glasgow) and Prof Michèle Weiland (University of Edinburgh)
Overview and Background
Flood modelling is crucial for understanding flood hazards, now and in the future as a result of climate change. Modelling provides inundation extents (or flood footprints) which outline areas at risk and can help to manage our increasingly complex infrastructure network as our climate changes. Our ability to make fast, accurate predictions of fluvial inundation extents is important for disaster risk reduction. Simultaneously capturing uncertainty in forecasts or predictions is essential for efficient planning and design. Both aims require methods which are computationally efficient whilst maintaining accurate predictions. Current Navier–Stokes physics-based models are computationally intensive; this topic would therefore explore approaches to hybrid flood models which utilise GPU compute and ML fused with physics-based models, as well as investigating scaling the numerical models to large-scale HPC resources.
Methodology and Objectives
Methods Used: Machine learning, statistical modelling, optimised process models using GPU computation.
Teaser Project 1
Exploring the advantages and limitations of GPU-enabled approaches to flood modelling in contrast to traditional process-based flood modelling. Key considerations would be characterising the computational advantage of different ML approaches (especially physics-informed machine learning models), considering both training and inference, and the corresponding accuracy in comparison to traditional process-based models. In addition, we will explore enhancing traditional process-based models by investigating the opportunities for exploiting large-scale, GPU-accelerated HPC. Using data available from a range of sources (e.g. satellite and sensor networks, as well as model outputs), different ML approaches will be explored to represent the complex hydrodynamics which a process-based model would capture.
This project will naturally extend to a full PhD exploring hybrid modelling approaches, with a key focus on understanding the level of accuracy these models can achieve.
Teaser Project 2
Uncertainty quantification is becoming increasingly important, as binary predictions give at best a limited outlook on the model and at worst can mislead policy makers who may not consider the implications of enforcing a binary outcome on flood forecasting models, or on adaptation development. However, with particularly slow high-fidelity models, gaining accurate and meaningful uncertainty estimates via Monte Carlo is either incredibly time-consuming or simply impossible. There are multiple solutions to this, including the use of surrogate/ML models (which can run the simulation faster) or improved Monte Carlo procedures (e.g. see Aitken et al. 2024). While this is computationally useful, it is important to understand the implications for the calibration of the uncertainty quantification produced by these approaches.
Thus, following from Aitken et al. (2024), in this teaser project we will consider a large range of possible approaches, use them to obtain uncertainty quantifications, and compare them to the uncertainty estimates obtained from a high-fidelity model, e.g. LISFLOOD-FP or Telemac2D. Due to the computational requirements of this approach, this is likely to require large-scale compute, on both traditional and GPU hardware. Comparisons will then be made between the UQ relying on large compute and that developed in this teaser project, allowing an understanding of the trade-offs between these approaches.
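A minimal sketch of this kind of comparison, with a cheap analytic function standing in for the expensive hydraulic model (purely an assumption for illustration): fit a surrogate on a few "expensive" runs, then propagate input uncertainty through both models by Monte Carlo and compare the resulting quantiles:

```python
import numpy as np

rng = np.random.default_rng(42)

def high_fidelity(x):
    """Stand-in for an expensive hydraulic model: parameter -> flood depth."""
    return np.sin(x) + 0.5 * x

# Train a cheap polynomial surrogate on a handful of 'expensive' runs.
x_train = np.linspace(0.0, 3.0, 8)
surrogate = np.poly1d(np.polyfit(x_train, high_fidelity(x_train), deg=3))

# Propagate input uncertainty through both models by Monte Carlo.
x_mc = rng.uniform(0.0, 3.0, 100_000)
q_hf = np.quantile(high_fidelity(x_mc), [0.05, 0.5, 0.95])
q_sur = np.quantile(surrogate(x_mc), [0.05, 0.5, 0.95])
```

In the real setting the high-fidelity quantiles are the expensive reference requiring large-scale compute, and the question is how well the surrogate's uncertainty estimates remain calibrated against them.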
This teaser project naturally expands into a wider PhD designing and developing novel GPU-enabled methods to obtain well-calibrated uncertainty estimates via a combination of statistical and machine learning techniques, giving rapid outputs to decision and policy makers.
References and Further Reading
- Aitken, G.; Beevers, L.; Christie, M.A. Advanced Uncertainty Quantification for Flood Inundation Modelling. Water 2024, 16, 1309 (click here)
- Andersson, T.R., Hosking, J.S., Pérez-Ortiz, M. et al. Seasonal Arctic sea ice forecasting with probabilistic deep learning. Nat Commun 12, 5124, 2021 (click here)
- Aitken, G., Beevers, L., & Christie, M. A. (2022). Multi-level Monte Carlo models for flood inundation uncertainty quantification. Water Resources Research, 58, e2022WR032599 (click here)
- Fraehr, Niels, et al. “Assessment of surrogate models for flood inundation: The physics-guided LSG model vs. state-of-the-art machine learning models.” Water Research 252 (2024): 121202
Extreme air pollution during European heatwaves: detangling the drivers through ultra-high-resolution modelling
Project institution: Lancaster University
Project supervisor(s): Prof Ryan Hossaini (Lancaster University), Dr Andrea Mazzeo (Lancaster University), Dr Lily Gouldsbrough (Lancaster University), Dr Helen Macintyre (UK Health Security Agency) and Prof Oliver Wild (Lancaster University)
Overview and Background
While heatwaves (sustained periods of hot weather) are a well-recognized public health hazard, growing evidence highlights an emerging risk from the co-occurrence of extreme temperature and air pollution[1,2]. The 2022 European heatwave, when the UK recorded its first ever temperature >40°C, was accompanied by a widespread deterioration in air quality, with surface levels of ozone and other air pollutants exceeding safe limits across much of the continent[3]. The causal relationship between extreme temperature and extreme air pollution levels is complex, involving synoptic weather patterns affecting air movement, atmospheric chemistry, and pollutant emissions (e.g. from wildfires)[4,5]. In combination, these factors are not adequately understood or quantified but are important to detangle, as summer heatwaves will become more frequent and intense due to climate change, meaning this ‘climate penalty’ for air quality could worsen[6]. This project will provide powerful new insight into the drivers of European extreme air pollution events during heatwaves and the associated health effects. This will be achieved through ultra-high-resolution model simulations of air pollutant behaviour, supported by satellite observations and other big observational datasets.
The successful candidate will join LEC’s vibrant atmospheric science research group: AtMOS
Methodology and Objectives
The two teaser projects are linked by an overarching theme (extreme air pollution during heatwaves), though are distinct in focus and employ different numerical modelling approaches. Teaser #1 involves a European-scale assessment with emphasis on improving scientific understanding of the underpinning processes responsible for elevating ozone during heatwaves. Teaser #2 provides a UK-scale assessment with emphasis on forecasting of extreme events and assessment of health effects. Both teasers will equip the student with key transferable skills around the acquisition/manipulation of atmospheric measurement and model datasets and the application of policy-relevant metrics to assess model performance.
Teaser Project 1:
Focussing on recent summer heatwaves, a Europe-wide assessment of the drivers of extreme ozone events will be performed. During the ‘teaser’, the temperature-ozone relationship will first be quantified by analysing measurement records from a large number of European monitoring sites, including the newly-available, extensive TOAR-II database of surface ozone observations (Year 1). The project’s principal modelling tool will be the FRSGC/UCI chemical transport model (CTM) that is developed and maintained in Lancaster. During the teaser, the ability of the model to capture elevated summertime ozone will be examined using the 2022 heatwave as a case study (Year 1).
If developed into a full project, the work will be expanded in scope to cover other notable European heatwaves (e.g. summer 2018). In addition to surface measurements, the behaviour of ozone, and the model’s ability to capture it, will be further evaluated with satellite measurements of atmospheric composition (e.g. ozone from the GOME-2 and IASI instruments). The focus of the project in Years 2/3 will be to detangle the drivers of elevated ozone using carefully designed model sensitivity experiments. These will allow, for example, the significance of temperature-induced increases in ozone precursor emissions to be explored, including biogenic volatile organic compound emissions from vegetation (e.g. isoprene) and wildfires (CO, NOx), along with assessment of the long-range transport of ozone from outside continental Europe (including from the stratosphere).
Objectives:
- Characterize the European ozone-heatwave response across multiple summers using a suite of surface and satellite measurements.
- Assess the ability of the FRSGC/UCI atmospheric chemical transport model to capture extreme ozone events and the observed ozone-temperature relationship.
- Interpret the observed ozone-heatwave response using high resolution model simulations. Explore multiple factors, including the relative role of meteorological versus chemical drivers, the effects of model horizontal resolution, and other assumptions.
Teaser Project 2:
Process-based air quality models are frequently used to ‘hindcast’ the state of air quality over a given region, providing information required to assess the impact of changing air pollutant levels on public exposure and health. Additionally, such models are now increasingly used to alert the public and health care providers in advance of upcoming air pollution episodes (i.e. ‘forecast’). Like weather forecasts, air quality forecasts may be provided up to several days ahead, with forecast confidence generally decreasing with increasing lead time. As forecast skill is often inadequate, particularly for the most ‘extreme’ episodes, a body of literature on possible approaches to ‘bias correct’ forecasts (before they are issued) has emerged, some of which involve near real-time data assimilation[7-9].
This project will examine the ability of WRF-Chem to simulate UK air quality in both hindcast and forecast modes, with an emphasis on heatwave periods. WRF-Chem is a well-evaluated and widely adopted model suitable for high resolution simulations at the country scale. During the project’s ‘teaser’ part (Year 1), the skill of WRF-Chem to forecast surface ozone will be assessed considering a range of forecast lead times (24 to 96 hours) and by applying a range of key metrics (e.g., hit rate, false alarm rate etc.). For model evaluation, we will utilise the UK’s extensive ‘AURN’ network of air pollutant measurements. If developed into a full project, the student will explore the effectiveness of a range of bias correction techniques (Year 2), with emphasis on developing and implementing a scheme that improves the tail of the ozone distribution in both hindcast and forecast set ups. Bias-corrected hindcasts will be produced and the annual mortality burden attributable to long-term air pollutant exposure will be quantified[10] (Year 3). This analysis will be performed in time slices from ~1990 to present, allowing the effectiveness of air quality legislation on health burdens over time to be quantified.
Objectives:
- Evaluate the skill of the WRF-Chem model in reproducing surface ozone and other air pollutants over the UK during recent heatwaves.
- Assess the efficacy of a range of bias correction techniques (e.g. ‘quantile mapping’) applied to WRF-Chem hindcasts and forecasts.
- Produce bias-corrected hindcasts of key air pollutants and quantify the associated human health effects of air pollution in the UK over time.
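As a concrete illustration of one technique named in the objectives, empirical quantile mapping corrects a biased model distribution against observations by matching quantiles. The synthetic "ozone" data below are an assumption for demonstration only, not AURN or WRF-Chem output:

```python
import numpy as np

def quantile_map(forecast, model_clim, obs_clim):
    """Empirical quantile mapping: locate each forecast value's quantile in
    the model climatology, then read off the observed climatology's value
    at that same quantile."""
    q = np.searchsorted(np.sort(model_clim), forecast) / len(model_clim)
    return np.quantile(obs_clim, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=20.0, size=5000)   # synthetic 'observed' ozone
model = 0.7 * obs + 25.0                             # synthetic biased 'model'
corrected = quantile_map(model, model_clim=model, obs_clim=obs)
```

In practice the climatologies would be built from paired hindcast and AURN records, and the interesting question, as noted above, is how well such schemes correct the extreme tail of the distribution rather than just the mean.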
References
- Schnell, J.L., and Prather, M.J. (2017). Co-occurrence of extremes in surface ozone, particulate matter, and temperature over eastern North America. Proc. Natl. Acad. Sci., 114, 2854-2859 (click here)
- Gouldsbrough, L., Hossaini, R., Eastoe, E., & Young, P.J.Y. (2022). A temperature-dependent extreme value analysis of UK surface ozone, 1980-2019. Atmos. Env., 273, 118975
- Copernicus scientists warn of very high ozone pollution as heatwave continues across Europe
- Pope, R. J., et al. (2023). Investigation of the summer 2018 European ozone air pollution episodes using novel satellite data and modelling, Atmos. Chem. Phys., 23, 13235-13253 (click here)
- Otero, N., Jurado, O. E., Butler, T., and Rust, H. W. (2022). The impact of atmospheric blocking on the compounding effect of ozone pollution and temperature: a copula-based approach, Atmos. Chem. Phys., 22, 1905-1919 (click here)
- Doherty, R.M., Heal, M.R., and O’Connor, F.M. Climate change impacts on human health over Europe through its effect on air quality, Environ. Health, 16 (click here)
- Neal, L.S., Agnew, P., Moseley, S., Ordóñez, C., Savage, N.H. and Tilbee, M. (2014). Application of a statistical post-processing technique to a gridded, operational, air quality forecast, Atmos. Env., 98, 385-393 (click here)
- Staehle, C., et al. (2024). Technical note: An assessment of the performance of statistical bias correction techniques for global chemistry–climate model surface ozone fields, Atmos. Chem. Phys., 24, 5953-5969 (click here)
- Gouldsbrough, L., Hossaini, R., Eastoe, E., Young, P.J.Y. & Vieno, M. (2023). A machine learning approach to downscale EMEP4UK: analysis of UK ozone variability and trends. Atmos. Chem. Phys., 24, 163-3196 (click here)
- Macintyre, H.L., et al. (2023). Impacts of emissions policies on future UK mortality burdens associated with air pollution. Environ. Int., 174, 107862 (click here)
Further Reading
- How UK’s record heatwave affected air pollution
- World meteorologists point to ‘vicious cycle’ of heatwaves and air pollution
- Schnell, J.L., and Prather, M.J. (2017). Co-occurrence of extremes in surface ozone, particulate matter, and temperature over eastern North America. Proc. Natl. Acad. Sci., 114, 2854-2859 (click here)
- Pope, R. J., et al. (2023). Investigation of the summer 2018 European ozone air pollution episodes using novel satellite data and modelling, Atmos. Chem. Phys., 23, 13235-13253 (click here)
Investigating the rheology of volcanic granular flows with GPU-based Discrete Element Method
Project institution: University of Edinburgh
Project supervisor(s): Dr Kevin Stratford (University of Edinburgh), Dr Eric Breard (University of Edinburgh) and Prof Jin Sun (University of Glasgow)
Overview and Background
Geophysical flows, including those produced by landslides, volcanic eruptions, and extreme weather events, are among the most pervasive natural hazards, posing significant risks to society. Despite their profound impact, our understanding of their complex rheology, which drives the extent and speed of these flows, remains limited, hampering the development of accurate hazard models. Major challenges lie in capturing the non-sphericity of the particulate mixtures (generally assumed to be spheres) and their transient behaviour (often oversimplified as steady-state), and in unravelling how these flows interact with substrates, driving the sedimentation or substrate entrainment that influences their reach and dynamics. Leveraging the advanced GPU capabilities of the new open-source solver MFIX-Exa, we aim to simulate these complex multiphase processes with unparalleled precision. Our work will derive simplified constitutive equations that account for mass and momentum changes due to sedimentation or substrate entrainment, transforming predictive models and enhancing hazard mitigation strategies.
Teaser Project 1: Capturing flow initiation and arrest (jamming transition) in particle geophysical flows
Objective: This project focuses on gas-particle geophysical flows, with implications for landslides, pyroclastic flows, and debris flows. Our aim is to uncover how transient processes govern the transition from inertial flows to jamming (e.g., the sudden stopping of particles), which leads to deposition and momentum loss—an aspect currently missing from existing models. Using simulations, we will investigate the effects of grain size distribution and pore fluid pressure on deposition rates. The ultimate goal is to derive sedimentation and erosion rate laws that can be integrated into depth-averaged models, enabling more accurate hazard predictions.
Methods: Our approach combines high-resolution simulations using the discrete element method (DEM) coupled with computational fluid dynamics (CFD) to unravel the complex dynamics of geophysical granular flows. We will utilise the novel open-source solver MFIX-Exa, into which we will add the missing physics necessary to describe water-particle interactions, including lubrication forces, added mass, and Saffman lift forces. These additions are essential for accurately modelling bedload transport and debris flows.
We will then simulate the interaction of granular media, both with and without excess pore pressure, as it impacts a loose substrate. This will allow us to track flow-substrate dynamics as the base of the flow transitions to a jammed state. Post-processing of the DEM-CFD data will be conducted using a coarse-graining approach to derive continuum fields (e.g., stress tensors, velocity, granular temperature). These fields will facilitate the derivation of constitutive equations describing flow rheology and flow-substrate interactions, specifically the sedimentation processes (mass and momentum loss) and substrate erosion/entrainment (mass and momentum gain) in the granular flowing layer.
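The coarse-graining step can be illustrated in one dimension: discrete particle masses and velocities are smoothed with a Gaussian kernel into continuum density and velocity fields. The particle data below are synthetic placeholders, and a real coarse-graining code would work in 3D and also produce stress tensors and granular temperature:

```python
import numpy as np

def coarse_grain(x_p, m_p, v_p, x_grid, width):
    """Map discrete particle data onto continuum fields with a Gaussian
    kernel W: density rho(x) = sum_p m_p W(x - x_p), and the
    momentum-weighted (mass-averaged) velocity field."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x_p[None, :]) / width) ** 2)
    w /= width * np.sqrt(2.0 * np.pi)            # normalised 1D kernel
    rho = (w * m_p).sum(axis=1)                  # mass density on the grid
    mom = (w * m_p * v_p).sum(axis=1)            # momentum density
    return rho, mom / np.maximum(rho, 1e-12)

rng = np.random.default_rng(1)
x_p = rng.uniform(0.0, 10.0, 2000)   # particle positions (1D for clarity)
m_p = np.full(2000, 0.1)             # particle masses
v_p = 0.5 * x_p                      # an imposed linear velocity profile
x_grid = np.linspace(1.0, 9.0, 17)
rho, vel = coarse_grain(x_p, m_p, v_p, x_grid, width=0.5)
```

The recovered fields approximate the uniform density and linear velocity profile of the particle data, which is the property that lets continuum rheology be read off from DEM-CFD output.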
Teaser Project 2: Understanding the role of particle shape on the rheology of geophysical flows
Objective: Our ultimate goal is to improve the physics underlying current hazard assessments, particularly for processes such as pyroclastic density currents, debris flows, landslides, mudflows, and turbidity currents. We aim to better understand how these flows shape landscapes and become so destructive. Natural particles in geophysical systems have complex, non-spherical shapes, but they are often approximated as spheres for simplicity. However, experimental observations reveal that the rheology of spherical and non-spherical granular media can differ significantly. The aim of this project is to implement a glued-sphere model into the open-source GPU-based flow solver MFIX-Exa, which uses DEM. NREL's BDEM is a massively parallel DEM solver capable of simulating a wide range of granular flow problems, and this code is ideally suited as the foundation for implementing non-spherical DEM into MFIX-Exa.
Methods: This project employs DEM-CFD simulations of granular flows to investigate the role of particle non-sphericity in driving the remarkable complexity of geophysical flows. Once the glued-sphere approach is implemented, we will systematically explore the effects of particle shape (e.g., roundness, sphericity) and examine its impact on flow rheology across quasi-static, intermediate, and inertial flow regimes (ranging from gas-like to solid-like behaviours). Discrete simulation data will be transformed into a continuum framework using a coarse-graining code. This will enable us to describe the influence of particle shape on granular flow rheology for integration into reduced-order numerical models of geophysical flows.
References & Further Reading
- Musser, J., Almgren, A. S., Fullmer, W. D., Antepara, O., Bell, J. B., Blaschke, J., … & Syamlal, M. (2022). MFIX-Exa: A path toward exascale CFD-DEM simulations. The International Journal of High Performance Computing Applications, 36(1), 40-58
- Lu, L., Gao, X., Shahnam, M., & Rogers, W. A. (2021). Simulations of biomass pyrolysis using glued-sphere CFD-DEM with 3-D intra-particle models. Chemical Engineering Journal, 419, 129564
- Exascale Project: MFIX-EXA
- NREL/BDEM
Multi-scale modelling of volcanoes and their deep magmatic roots: fluid release from subvolcanic magma bodies
Project institution: University of Glasgow
Project supervisor(s): Dr Tobias Keller (University of Glasgow), Prof Andrew McBride (University of Glasgow), Prof Jin Sun (University of Glasgow) and Dr Ankush Aggarwal (University of Glasgow)
Overview and Background
This PhD studentship focuses on developing GPU-accelerated models of magmatic processes that underpin volcanic hazards and magmatic resource formation. These processes span sub-millimetre mineral-melt-fluid interactions up to kilometre-scale magma dynamics and crustal deformation. Magma is a multi-phase mixture of solids, silicate melts, and volatile-rich or other fluids, interacting in complex thermo-chemical-mechanical ways. The project will contribute one component of a hierarchical, multi-scale modelling framework using advanced GPU-based techniques. In this project, you will focus on developing a system-scale model of fluid exsolution and extraction from a crystallising magma body, with implications for volcanic unrest preceding eruptions and the genesis of magmatic-hydrothermal deposits of critical metal ore.
Your work will include software development, integrating and interpreting field and experimental data sets, attending regular seminars, collaborating within a research team, and receiving training through ExaGEO workshops.
Volcanic eruptions originate from shallow crustal magma reservoirs built up over long periods. As magma cools and crystallises, it releases fluid phases—aqueous, briny, or containing carbonates, metal oxides, or sulfides—whose low viscosity and pronounced density contrasts drive fluid segregation. This fluid migration can trigger volcanic unrest or concentrate metals into economically valuable deposits. The micro- to meso-scale distribution of fluids—discrete droplets versus interconnected drainage networks—crucially depends on crystal and melt properties. Direct observations are challenging, so high-resolution, GPU-accelerated simulations provide a way to understand these complex and dynamic systems.
Methodology and Objectives
Modelling volcanic systems is challenging due to the multi-scale nature of their underlying physical and chemical processes. System-scale dynamics (100 m to 100 km) emerge from interactions involving crystals, melt films, and fluid droplets or channels on micro- to centimetre scales. To link these scales, this project uses a hierarchical approach: (i) direct numerical simulations of granular-scale phase interactions, (ii) deep learning-based computational homogenisation to extract effective constitutive relations, and (iii) system-scale mixture continuum models applying these relations to problems. All components leverage GPU-accelerated computing and deep learning to handle direct simulations at local scales, train effective constitutive models, and achieve sufficient resolution at the system scale.
In this project the candidate will develop and apply a novel system-scale three-phase flow model informed by effective constitutive models derived from granular-scale simulations and computational homogenisation. The model will build on a recent multi-phase reaction-transport theory framework [1] and numerical treatment [2], which will be implemented in a GPU-accelerated algorithm built on cutting-edge Julia packages [3]. To inform system-scale reaction and transport rates, the simulations will utilise constitutive models derived from granular-scale direct simulations and computational homogenisation (delivered by partner projects). The simulations will be used to systematically investigate the role of a regime transition in the transport of fluid phases through subvolcanic magma bodies, from disconnected bubble migration to interconnected channelised drainage [4].
Within this framework, the student will start by working on two “teaser” projects to gain familiarity with different techniques and data, then choose how to further develop and focus their research.
Teaser Project 1:
This teaser project, conducted over the first year, will focus on the implementation of a mechanical three-phase (solid+liquid+liquid/vapour) transport model using GPU-accelerated algorithms in Julia. The implementation will follow from previous work demonstrated in a serial Matlab prototype [2] and use a staggered-grid finite-difference method to discretise the set of underlying PDEs in combination with a matrix-free iterative solution approach demonstrated to be highly efficient when run on massively parallel GPU infrastructure. The model will be structured such that it can switch between using traditional constitutive models as analytical functions of system variables [1,2] and querying trained neural nets which output flux and transfer rates given gradients and phase deviations in system variables.
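As a toy illustration of the staggered-grid, matrix-free pseudo-transient idea (sketched here in Python for readability, whereas the project itself targets GPU-accelerated Julia), consider solving a 1-D Poisson-type pressure problem by marching an explicit pseudo-time update to steady state; each iteration touches only local stencil neighbours, which is what makes the approach GPU-friendly:

```python
import numpy as np

n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
P = np.zeros(n)                  # pressure at cell nodes, P=0 at boundaries
f = np.sin(np.pi * x)            # illustrative source term
k = 1.0                          # permeability-like coefficient
dtau = 0.4 * dx**2 / k           # pseudo-time step within the stability limit

for _ in range(50_000):
    q = -k * np.diff(P) / dx             # fluxes on the staggered half-points
    resid = -np.diff(q) / dx - f[1:-1]   # residual of k*P'' = f
    P[1:-1] += dtau * resid              # matrix-free pseudo-transient update
    if np.max(np.abs(resid)) < 1e-8:     # converged to steady state
        break
```

For k*P'' = sin(pi*x) with zero boundaries the exact solution is P = -sin(pi*x)/pi**2, so the iteration can be verified against it. The real implementation would add damping (second-order pseudo-transient acceleration) and extend the stencil updates to the coupled three-phase PDE system.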
Teaser Project 2:
This teaser project, conducted over the first year, will focus on implementing and calibrating a multi-component petrological model to represent fractional crystallisation and fluid exsolution in a chosen volcanic context (to be determined). The petrological model will follow an approach utilising thermodynamics-inspired fitting functions [5] and will be calibrated against the output of an energy-minimising thermodynamic equilibrium solver [6] using machine learning tools. This established approach for formulating an approximate but robust and efficient form of phase equilibrium model will be compared to a novel approach of training a neural network to take the pressure, temperature, and element composition of magmatic materials and return the proportions and compositions of stable phase assemblages.
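The calibration idea can be illustrated on a minimal scale: fit a thermodynamics-inspired fitting function to equilibrium-solver output. In the sketch below, synthetic data stand in for MAGEMin results, and the power-law melt-fraction form between an assumed solidus and liquidus is purely illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def melt_fraction(T, T_sol, T_liq, p):
    """Illustrative melt-fraction parameterisation between solidus T_sol
    and liquidus T_liq, with a shape exponent p."""
    x = np.clip((T - T_sol) / (T_liq - T_sol), 0.0, 1.0)
    return x**p

# Synthetic "equilibrium solver" output with small noise (stand-in data)
rng = np.random.default_rng(1)
T = np.linspace(900.0, 1300.0, 80)
truth = melt_fraction(T, 950.0, 1250.0, 1.5)
obs = np.clip(truth + rng.normal(0.0, 0.01, T.size), 0.0, 1.0)

# Calibrate the fitting function against the synthetic solver output
params, _ = curve_fit(melt_fraction, T, obs,
                      p0=[900.0, 1300.0, 1.0],
                      bounds=([800.0, 1100.0, 0.1], [1100.0, 1400.0, 5.0]))
```

The neural-network alternative described above would replace the closed-form function with a trained regressor over (pressure, temperature, composition), traded off against the robustness and interpretability of the fitted form.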
References and Further Reading
- Keller, T. and Suckale, J., 2019. A continuum model of multi-phase reactive transport in igneous systems. Geophysical Journal International, 219(1), pp.185-222
- Wong, Y.Q. and Keller, T., 2023. A unified numerical model for two-phase porous, mush and suspension flow dynamics in magmatic systems. Geophysical Journal International, 233(2), pp.769-795
- ParallelStencil; ImplicitGlobalGrid
- Degruyter, W., Parmigiani, A., Huber, C. and Bachmann, O., 2019. How do volatiles escape their shallow magmatic hearth?. Philosophical Transactions of the Royal Society A, 377(2139), p.20180017
- Riel, N., Kaus, B.J., Green, E.C.R. and Berlie, N., 2022. MAGEMin, an efficient Gibbs energy minimizer: application to igneous systems. Geochemistry, Geophysics, Geosystems, 23(7), p.e2022GC010427
Uncertainty determination and visualisation of volcanic co-PDC ash plume dispersal
Project institution: Lancaster University
Project supervisor(s): Dr Thomas Jones (Lancaster University), Dr Frances Beckett (UK Met Office), Charlie Bates (UK Met Office) and Professor Mike James (Lancaster University)
Overview and Background
Substantial progress has been made in modelling the dispersion of volcanic plumes from explosive eruptions, but plumes formed from pyroclastic density currents (i.e., co-PDC plumes) have been largely neglected. They comprise fine-grained ash particles and hot gas, can reach heights of tens of kilometres, and can disperse large volumes of ash over continental-scale areas, impacting the environment and posing a risk to aviation. This project, in partnership with the Met Office, will quantify the uncertainties of modelling co-PDC ash dispersion using NAME (Numerical Atmospheric-dispersion Modelling Environment), which is used to generate forecasts for the London Volcanic Ash Advisory Centre. You will also construct workflows that can post-process NAME outputs from large, ensemble runs, into graphics/forecasts for multiple end-users (e.g., aviation industry, meteorologists, research scientists) at operational speed.
Methodology and Objectives
Key methods and tools: ensemble forecasting; Monte Carlo analysis; numerical weather prediction models; Lagrangian and Eulerian particle dispersion models (NAME); parallel computing; probability mapping; JASMIN; high-performance computing.
Teaser Project 1: Evaluating uncertainties associated with co-PDC ash dispersion
The Met Office is home to the London Volcanic Ash Advisory Centre (VAAC). The role of the London VAAC is to provide advice, forecasts and guidance to the aviation authorities on the presence of volcanic ash in the atmosphere, especially for eruptions originating in Iceland. Going forward, the VAACs will be required to issue quantitative and probabilistic volcanic ash concentration information that incorporates uncertainties in both the weather data and the eruption source parameters (e.g., mass eruption rate, plume height, particle size, shape). Currently, the UK Met Office uses the Numerical Atmospheric-dispersion Modelling Environment (NAME) to provide operational forecasts based on a single set of meteorological data and for a specific set of eruption source parameters. However, to present probabilistic outputs or outputs with quantitative estimates of uncertainty, ensemble or Monte Carlo runs are required. This increases computational time and cost, which need to be minimised for real-time operational forecasting during an emergency.
In this teaser project you will address this upcoming challenge for the specific case of co-PDC ash plumes. You will develop code to use ensemble meteorological data with NAME for co-PDC plumes. You will optimise this approach on high performance computing infrastructure (e.g., JASMIN) such that it can be used over timescales appropriate for real-time eruption response. This teaser project could be further developed by exploring the uncertainty in eruption source parameters that are unique to co-PDCs (e.g., vent location, aspect ratio, particle shape) and, if time allows, by expanding to other types of volcanic eruption.
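The Monte Carlo treatment of source-parameter uncertainty might be set up along the lines of the sketch below. This is hypothetical Python: the parameter names, ranges, and distributions are invented for illustration, and NAME itself is not invoked; each sampled row would drive one ensemble member:

```python
import numpy as np

rng = np.random.default_rng(42)
n_members = 100

# Assumed, illustrative uncertainty ranges for co-PDC source parameters
samples = {
    "plume_height_km": rng.uniform(5.0, 15.0, n_members),
    # Mass eruption rate sampled log-uniformly over two decades
    "mass_eruption_rate_kg_s": 10 ** rng.uniform(5.0, 7.0, n_members),
    "median_grain_size_phi": rng.normal(4.0, 1.0, n_members),
}

# Each row is one source-parameter set for a single dispersion run
member_params = np.column_stack(list(samples.values()))
```

In an operational workflow, each row would be written to a NAME input file and the resulting runs dispatched in parallel across the HPC system, with the meteorological ensemble member varied alongside the source parameters.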
Teaser Project 2: Visualisation and interaction with large, complex ash dispersion datasets
Due to the real-time operational nature of the London VAAC, individual model runs can be executed within minutes; however, they generate large datasets that need to be post-processed and visualised quickly (e.g., ash concentration at different flight levels, total plume mass loadings, embedded wind fields and precipitation data, all as a function of time since the eruption). With the growing use of ensemble model forecasts, these datasets are expected to grow by several orders of magnitude in the coming years. Thus, developing computationally cheap methods to post-process these big datasets and provide effective visualisation presents a key challenge. In this teaser project you will take a set of ensemble volcanic ash dispersion data (i.e., a big-data set) and develop a set of parallel workflows and robust data structures to post-process these data and display the required VAAC graphics whilst minimising computational time. This teaser project could be further developed by tailoring these workflows and outputs to different end-users (e.g., aviation industry, meteorologists, research scientists), who will have different requirements and different knowledge bases for data visualisation and analysis.
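A core post-processing product of the kind described above is an exceedance-probability map: the fraction of ensemble members in which ash concentration exceeds a threshold at each grid cell. The sketch below uses synthetic concentrations and an illustrative threshold; a real workflow would stream NAME output from disk and parallelise over spatial chunks:

```python
import numpy as np

# Synthetic stand-in for stacked ash concentration fields (member, y, x)
rng = np.random.default_rng(7)
n_members, ny, nx = 50, 120, 160
conc = rng.lognormal(mean=-1.0, sigma=1.0, size=(n_members, ny, nx))  # mg/m^3

threshold = 2.0  # mg/m^3, an illustrative contamination threshold

# Fraction of ensemble members exceeding the threshold in each grid cell;
# a single vectorised pass, trivially parallelisable over spatial chunks
p_exceed = (conc > threshold).mean(axis=0)
```

The resulting field maps directly onto the probabilistic VAAC graphics discussed above, with one such map per flight level and forecast time.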
References and Further Reading
- Jones, T.J., Beckett, F., Bernard, B., Breard, E.C., Dioguardi, F., Dufek, J., Engwell, S. and Eychenne, J., 2023. Physical properties of pyroclastic density currents: relevance, challenges and future directions. Frontiers in Earth Science, 11, p.1218645
- Madankan, R., Pouget, S., Singla, P., Bursik, M., Dehn, J., Jones, M., Patra, A., Pavolonis, M., Pitman, E.B., Singh, T. and Webley, P., 2014. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion. Journal of Computational Physics, 271, pp.39-59
- Leadbetter, S.J., Jones, A.R. and Hort, M.C., 2022. Assessing the value meteorological ensembles add to dispersion modelling using hypothetical releases. Atmospheric Chemistry and Physics, 22(1), pp.577-596
- Capponi, A., Harvey, N.J., Dacre, H.F., Beven, K., Saint, C., Wells, C. and James, M.R., 2022. Refining an ensemble of volcanic ash forecasts using satellite retrievals: Raikoke 2019. Atmospheric Chemistry and Physics, 22(9), pp.6115-6134
- Beckett, F., Barsotti, S., Burton, R., Dioguardi, F., Engwell, S., Hort, M., Kristiansen, N., Loughlin, S., Muscat, A., Osborne, M. and Saint, C., 2024. Conducting volcanic ash cloud exercises: practising forecast evaluation procedures and the pull-through of scientific advice to the London VAAC. Bulletin of Volcanology, 86(7), p.63
- Beckett, F.M., Witham, C.S., Hort, M.C., Stevenson, J.A., Bonadonna, C. and Millington, S.C., 2015. Sensitivity of dispersion model forecasts of volcanic ash clouds to the physical characteristics of the particles. Journal of Geophysical Research: Atmospheres, 120(22), pp.11-636
- Met Office Dispersion Model
- JASMIN
Projects with a focus on Sustainability Solutions in Engineering, Environmental, and Social Sciences:
AI Meets Glasgow’s Trees: Metrics Prediction, 3D Mapping, and Socio-Ecosystem Impact Simulations
Project institution: University of Glasgow
Project supervisor(s): Dr Meiliu Wu (University of Glasgow), Dr Davide Dominoni (University of Glasgow), Dr Luigi Cao Pinna (University of Glasgow), Dr Dominic McCafferty (University of Glasgow), Dr Alex Bush (Lancaster University), Doug McNeil (EOLAS Insights Ltd) and Gillian Dick (Glasgow City Council)
Overview and Background
The project aims to harness the power of AI to advance sustainable forestry and woodland management in Glasgow, focusing on predicting tree metrics (e.g., species, height, canopy area, and tree health), creating a detailed and interactive 3D tree map, and assessing the impacts of trees on the socio-ecosystem. This research will combine cutting-edge deep learning techniques and statistical inference with diverse data sources, including high-resolution LiDAR data, remote sensing imagery, street view images, environmental sensor data, citizen-science text, and social media. By leveraging exascale GPU computing, the project will develop scalable, real-time models and tools to support sustainable urban planning, biodiversity conservation, and climate resilience efforts.
The project will comprise two teaser projects: (1) AI-powered prediction of tree metrics and development of an interactive 3D visualisation of Glasgow’s forestry and woodland; and (2) socio-ecosystem simulations to evaluate trees’ impacts on biodiversity, climate mitigation, and human well-being. These components will enable researchers, city planners, and policymakers to make data-driven decisions for a greener, more sustainable Glasgow.
Teaser Project 1: AI for Glasgow’s Greenness: Predicting and Visualising Tree Metrics with 3D Insights
The objective is to develop advanced AI models that predict key metrics of Glasgow’s trees, including species, height, canopy area, and tree health, and visualise these metrics on an interactive 3D map.
Data Sources:
- Ground truth data: Collect and label ground truth data for tree species, height, canopy area, and age.
- LiDAR and Remote Sensing (RS): High-resolution 3D point clouds and spectral imagery.
- Street View Images: Ground-level visual perspectives of trees.
- Citizen-Science Text and Social Media: Descriptive keywords and local observations about tree conditions.
Models and Approach:
- In Year 1 you will train or fine-tune deep learning models, such as convolutional neural networks (CNNs), for image and LiDAR data analysis to predict tree height, canopy area, and age.
- In Year 2 you will fine-tune vision-language models (VLMs, e.g., OpenAI’s CLIP and GPT-4) to seamlessly integrate visual inputs (e.g., remote sensing imagery, street view photos, citizen-science images, and social media visuals) with textual data (e.g., citizen-science descriptions and social media posts), to predict tree species and assess tree health (Wu et al., 2023; Wu & Huang, 2022). Specifically, pre-trained VLMs will be fine-tuned using image-text pairs as input. These pairs will consist of images associated with species IDs from citizen-science databases (e.g., Treezilla, GBIF, and iNaturalist), supplemented with collaboratively sourced community data on Glasgow trees, including their health and overall condition.
- In Year 3 you will map the collected and predicted attributes of Glasgow’s forestry and woodland as a web-based 3D visualisation.
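Before any deep learning, simple geometric baselines for the tree metrics above can be computed directly from LiDAR point clouds. The sketch below uses a synthetic single-tree cloud and assumed thresholds, purely to illustrate how height and canopy area fall out of the 3D points:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic single-tree point cloud: crown returns in a ~3 m radius disc,
# heights between 2 m and 12 m (stand-in for real LiDAR returns)
rng = np.random.default_rng(3)
n = 2000
theta = rng.uniform(0, 2 * np.pi, n)
r = 3.0 * np.sqrt(rng.uniform(0, 1, n))       # uniform over the disc
x, y = r * np.cos(theta), r * np.sin(theta)
z = rng.uniform(2.0, 12.0, n)

height = np.percentile(z, 99)                 # robust top-of-canopy height
crown = z > 0.5 * height                      # keep upper-crown returns
canopy_area = ConvexHull(np.column_stack([x[crown], y[crown]])).volume
# For a 2-D hull, `volume` is the enclosed area (here close to pi * 3**2)
```

Metrics like these also provide training labels and sanity checks for the CNN and VLM predictions described above.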
Outputs:
- A detailed, fine-grained, and interactive 3D map of Glasgow’s trees, showing tree metrics (e.g., height, canopy area, species, health, and diversity), serving as a visualisation platform for city planners, ecologists, and researchers.
Teaser Project 2: Assessing Socio-Ecological Benefits of Glasgow’s Trees: AI-Powered Semantic Insights and Simulations
The objective is to measure the socio-ecological benefits of Glasgow’s trees, focusing on public sentiment, biodiversity, and climate resilience using AI-powered simulation models and semantic analysis.
Data Sources:
- Tree data: location, canopy cover, height, etc.
- Biodiversity data (e.g., insect, bird and mammal data collected at 40 sites in Glasgow from NERC GALLANT project).
- Environmental sensor data (e.g., temperature and air quality).
- Natural hazard data (e.g., flood-prone areas).
- Social media data and citizen-science descriptive text.
Models and Approach:
- Tree indices: Measure tree diversity and abundance/biomass indices by area/zone.
- Semantic Analysis: Use large language models (e.g., BERT) and statistical topic modelling methods (e.g., Latent Dirichlet Allocation (LDA)) to analyse public sentiment and attitudes toward urban trees from citizen-science text and social media posts (e.g., from Flickr and Twitter) (e.g., Cao Pinna et al., 2024).
- AI-powered Simulation: Associating Tree Indices with Environmental Variables
- Cooling Effects: By integrating tree attributes, land surface temperatures, and urban heat island patterns, deep learning models (e.g., CNNs) can predict temperature reductions at fine spatial-temporal scales and identify optimal tree placement.
- Flood Mitigation: By using time-series rainfall data, deep learning techniques (e.g., RNNs) can predict and analyse how tree roots, soil permeability, and canopy interception influence stormwater absorption and surface runoff, and evaluate how tree planting strategies mitigate flood risks.
- Carbon Sequestration: Deep learning models (e.g., reinforcement learning) can leverage tree growth data, biomass estimates, and carbon flux measures to predict carbon sequestration rates, by identifying patterns of carbon uptake across diverse tree species and environmental conditions.
- AI-enhanced Impact Assessment of Tree Abundance, Diversity, and Species
- Tree Abundance and Diversity: Deep learning techniques (e.g., unsupervised clustering) can identify patterns in tree abundance (e.g., total biomass, canopy area) and diversity (species richness, evenness) that promote insect diversity.
- Role of Specific Tree Species: Deep learning-powered species interaction models, such as graph neural networks (GNNs), can identify keystone tree species that sustain insect and bird populations across trophic levels.
- Uncertainty Measurement: Develop ensemble models to quantify uncertainties in predictions and simulate multiple future scenarios.
Outputs:
- Year 1: Semantic maps visualising public sentiment across the city.
- Year 2: Predictive simulations of urban trees’ roles in mitigating urban heat islands, reducing flood risks, enhancing air quality, and improving urban biodiversity in insects and birds.
- Year 3: Interactive dashboards for policymakers to evaluate tree-ecosystem trade-offs and benefits. For example, similar to NYC Tree Map, the dashboard could display ecological benefits: (1) Energy conserved each year (kWh); (2) Stormwater intercepted each year (gallons); and (3) Air pollutants removed each year (pounds).
References and Further Reading
- Cao Pinna, L., Miller, C., & Scott, M. (2024). Latent Dirichlet allocation and hidden Markov models to identify public perception of sustainability in social media data. International Workshop on Statistical Modelling, 14–20
- Wu, M., & Huang, Q. (2022). IM2City: image geo-localization via multi-modal learning. Proceedings of the 5th ACM SIGSPATIAL International Workshop on AI for Geographic Knowledge Discovery, 50–61
- Wu, M., Huang, Q., Gao, S., & Zhang, Z. (2023). Mixed land use measurement and mapping with street view images and spatial context-aware prompts via zero-shot multimodal learning. International Journal of Applied Earth Observation and Geoinformation, 125, 103591
- AI initiatives in silviculture, especially concerning woodland health detection
- Glasgow City Council’s forestry and woodland strategy
GPU-Accelerated High-Fidelity Hydrodynamics Modelling for Tidal Energy Resource and Environmental Impact Assessment
Project institution: University of Edinburgh
Project supervisor(s): Dr Joseph O’Connor (University of Edinburgh), Dr Brian Sellar (University of Edinburgh) and Dr Athanasios Angeloudis (University of Edinburgh)
Overview and Background
Tidal energy offers a predictable and sustainable energy source, driving increased interest in its development. However, as tidal energy deployments are scaled up, they place a greater burden on the local environment and ecosystem. High-fidelity hydrodynamic modelling tools are essential for predicting and mitigating environmental impacts, while also maximising energy extraction, to ensure this limited resource is used in a responsible way. However, these models are computationally demanding. Moreover, many existing tools are developed for traditional (CPU-based) HPC systems. With the advent of large GPU-based exascale machines, there is a need to prepare existing codes for this new HPC paradigm. As well as futureproofing existing codebases, this will provide a step change in computational capacity, unlocking new types of simulations (e.g. high-fidelity multi-physics) and workflows (e.g. optimisation, uncertainty quantification, data assimilation) not currently possible with today’s methods.
Methodology and Objectives
Both projects will use advanced computational techniques to improve hydrodynamic modelling capability for tidal energy resource assessment, as well as for predicting and mitigating environmental impacts. This will involve porting existing open-source hydrodynamic modelling tools to GPU to enable faster, larger, and more detailed simulations and workflows than are currently achievable. The focus is specifically on high-fidelity models, where the 3D hydrodynamic equations are solved numerically. Initially, this project will focus on tools already used by the supervisory team (e.g. TELEMAC-3D, Thetis). However, a survey period is envisaged to identify the most suitable tools to take forward to exascale applications.
Teaser Project 1: Accelerating high-fidelity hydrodynamic models for large-scale ensemble-based workflows
In most real-world applications, running a single simulation is often insufficient. Tasks such as optimisation and uncertainty quantification typically require hundreds/thousands of model evaluations. This is extremely computationally demanding, making it impractical with today’s high-fidelity methods. This project will enable these types of workflows on emerging GPU-based exascale machines by porting an existing high-fidelity hydrodynamics model to GPU. The objectives for the teaser project are:
- Survey and profile existing high-fidelity hydrodynamic modelling tools (starting with TELEMAC-3D and Thetis) to identify computational bottlenecks and evaluate the potential for porting to GPU.
- Build a proxy application replicating one of the identified bottlenecks (e.g. advection-diffusion equation) to test and evaluate different programming models/frameworks for porting to GPU (e.g. CUDA, OpenMP, SYCL).
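A proxy application of the kind described in the second objective might look like the numpy sketch below: a 1-D advection-diffusion stencil loop serving as the CPU reference that a CUDA, SYCL, or OpenMP-offload kernel would later replace (scheme and parameters are illustrative):

```python
import numpy as np

n = 400
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)   # periodic domain
dx = x[1] - x[0]
u, kappa = 1.0, 0.01                  # advection speed, diffusivity
dt = 0.4 * min(dx / u, dx**2 / (2 * kappa))            # stable time step
c = np.sin(x)                         # initial tracer field

for _ in range(500):
    # Periodic upwind advection + central diffusion: the hot stencil
    # kernel whose GPU port would be benchmarked against this baseline
    adv = -u * (c - np.roll(c, 1)) / dx
    diff = kappa * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (adv + diff)
```

Because the update is purely local, the same loop maps naturally onto a GPU kernel, making it a faithful miniature of the bottlenecks expected in TELEMAC-3D or Thetis.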
Following the first year, if this project is selected the remaining objectives will be to:
- Port selected components of the chosen model to GPU to improve computational performance. This will involve implementing kernels for GPU execution and optimising memory management.
- Benchmark the GPU-accelerated implementation to compare the performance against the existing CPU implementation and identify areas for optimisation. This will also involve testing the scalability across multiple GPUs for large-scale exascale applications.
- Integrate the GPU-accelerated model within a large-scale ensemble-based framework to enable workflows that require hundreds/thousands of model evaluations. This will be demonstrated on real-world cases by performing large-scale optimisation (e.g. for marine spatial planning) and uncertainty quantification (e.g. for model reliability) campaigns.
Teaser Project 2: Accelerating data assimilation for enhanced model calibration and predictive capability
Data assimilation (DA) combines real-world observational data with numerical simulations to enhance model predictions, leading to more reliable simulations for real-world problems. However, DA is computationally intensive, requiring sophisticated large-scale modelling and data processing techniques. This project will develop a GPU-accelerated DA framework tailored for tidal resource assessment to enable these workflows on emerging GPU-based exascale machines. The objectives for the teaser project are:
- Survey existing methods (e.g. 3D/4D variational DA, ensemble Kalman filter) and libraries for combining observational data with high-fidelity hydrodynamic models. This should also consider the form of the observational data (e.g. satellite, acoustic doppler current profiler, etc.).
- Build and profile a small CPU-based example of the DA framework to identify computational bottlenecks and determine priority components for porting to GPU (e.g. model, data processing, or a mix of both).
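As one concrete candidate from the survey above, the stochastic (perturbed-observation) ensemble Kalman filter analysis step can be sketched in a few lines of numpy; the toy state size, single scalar observation, and all numerical values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(5)
n_state, n_ens = 10, 50
ens = rng.normal(1.0, 0.5, (n_state, n_ens))       # prior ensemble

H = np.zeros((1, n_state)); H[0, 3] = 1.0          # observe state component 3
obs, obs_var = np.array([2.0]), 0.1**2

# Ensemble covariance terms (kept matrix-free at large scale)
A = ens - ens.mean(axis=1, keepdims=True)          # ensemble anomalies
HA = H @ A
P_hh = (HA @ HA.T) / (n_ens - 1) + obs_var         # innovation covariance
P_xh = (A @ HA.T) / (n_ens - 1)                    # state-obs covariance
K = P_xh / P_hh                                    # Kalman gain (scalar obs)

# Perturbed observations keep the analysis spread statistically consistent
y_pert = obs[:, None] + rng.normal(0.0, 0.1, (1, n_ens))
ens_post = ens + K @ (y_pert - H @ ens)
```

The analysis pulls the observed component toward the measurement and shrinks its spread; in the hydrodynamic setting the anomaly products above become the dominant cost and the natural target for GPU porting.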
Following the first year, if this project is selected the remaining objectives will be to:
- Port selected components of the DA framework to GPU to improve computational performance. This will involve selecting a suitable programming model/framework, implementing kernels for GPU execution, and optimising memory management.
- Benchmark and profile the GPU-accelerated implementation to compare the performance against the existing CPU implementation, as well as identifying areas for further development and optimisation.
- Demonstrate the new GPU-accelerated DA framework on real-world tidal energy applications (e.g. resource assessment and environmental impact). This will enable enhanced model calibration for uncertain parameters (e.g. bottom friction, turbulence parameters), as well as improve predictive accuracy by optimally combining model solutions with observational data.
References and Further Reading
- Almoghayer, M. A., Lam, R., Sellar, B., Old, C., & Woolf, D. K. (2024). Validation of tidal turbine wake simulations using an open regional-scale 3D model against 1MW machine and site measurements. Ocean Engineering, 299, 117402
- Old, C., Sellar, B., & Angeloudis, A. (2024). Iterative dynamics-based mesh discretisation for multi-scale coastal ocean modelling. Journal of Ocean Engineering and Marine Energy, 10(2), 313–334
- TELEMAC-MASCARET
- Thetis
