Building a better climate model with Machine Learning | AISC

Published on 2020-09-11 ● Video Link: https://www.youtube.com/watch?v=IVDus4XrhS0



Duration: 50:29
483 views


For slides and more information on the paper, visit https://ai.science/e/machine-learning-super-resolution--jFykgyJIuJaZEuhWxIMD

Speaker: Noah Brenowitz; Host: Peetak Mitra

Motivation:
Dr. Noah Brenowitz will discuss his research on building super-resolution models for climate applications, especially for accurately modeling physical processes relevant to general circulation models (GCMs).

Many small-scale, complex physical processes in GCMs cannot be explicitly resolved due to limited computational resources. Processes on scales smaller than the model's spatial resolution must instead be parameterized. These parameterizations are known to be major sources of uncertainty in GCMs, and various approaches have been proposed to deduce the influence of under-resolved and unresolved processes.

Generative adversarial networks (GANs) are a class of unsupervised machine learning methods that can generate realistic data from a target distribution. They are well suited to building emulators for complex physical processes, and hence are poised to serve as building blocks for parameterizations. The super-resolution GAN (SRGAN) and its variants were introduced in recent years to obtain photo-realistic images using a novel loss function: a weighted sum of an adversarial loss and a pixel-to-pixel content loss (a minimal sketch follows below).
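As an illustration only (not the speaker's code), a minimal PyTorch-style sketch of such a combined generator objective might look like the following. The `generator`, `discriminator`, and `adv_weight` names are hypothetical placeholders, and the perceptual (VGG feature) term of the original SRGAN content loss is omitted for brevity.

```python
import torch
import torch.nn.functional as F

def srgan_style_generator_loss(generator, discriminator, lowres, highres,
                               adv_weight=1e-3):
    """Weighted sum of a pixel-to-pixel content loss and an adversarial loss."""
    fake_highres = generator(lowres)

    # Pixel-to-pixel content loss between reconstruction and high-res target.
    content_loss = F.mse_loss(fake_highres, highres)

    # Adversarial loss: push the discriminator to label the reconstruction
    # as "real" (non-saturating GAN formulation).
    logits_fake = discriminator(fake_highres)
    adversarial_loss = F.binary_cross_entropy_with_logits(
        logits_fake, torch.ones_like(logits_fake))

    return content_loss + adv_weight * adversarial_loss
```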
We develop a data-driven approach using SRGAN and its variants, drawing parallels with the development of the super-parameterized CAM (SP-CAM). For simplicity and model consistency, the GANs are trained on cloud-resolving model (CRM) output from near-global CRM simulations ( https://doi.org/10.1002/2015MS000499 ), with the input distribution being a low-resolution, coarse-grained version of the original high-resolution CRM data; the GAN aims to reconstruct the original high-resolution fields.

We test the performance of these GANs with several reconstruction losses, including some motivated by physical constraints of importance to the domain of cloud physics. Our results show that these GANs can produce realistic high-resolution data from their low-resolution counterparts while satisfying some of the physical constraints. Our next step is to incorporate physical constraints more rigorously into the training and inference of these GANs, so that they can be used to construct realistic subgrid-scale parameterizations for convection.
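For intuition, the coarse-graining step described above could be approximated by simple block averaging of a high-resolution 2-D field, as in the sketch below. The coarsening factor of 8 and the field shape are illustrative assumptions, not the values used in the talk.

```python
import numpy as np

def coarse_grain(field: np.ndarray, factor: int = 8) -> np.ndarray:
    """Block-average a 2-D high-resolution field onto a coarser grid.

    `factor` is an illustrative coarsening ratio; the actual coarse-graining
    of the near-global CRM output may differ.
    """
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

# Example: build a (low-res input, high-res target) training pair.
highres = np.random.rand(256, 512)          # stand-in for a CRM field
lowres = coarse_grain(highres, factor=8)    # network input; highres is the target
```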




Other Videos By LLMs Explained - Aggregate Intellect - AI.SCIENCE


2020-09-23 explainX - Explainable AI for model developers | AISC
2020-09-22 Statistical Issues in Agent-Based Models | AISC
2020-09-22 Layerwise Learning for Quantum Neural Networks | AISC
2020-09-17 Survival regression with AFT model in XGBoost | AISC
2020-09-17 Detecting Off-Topic Spoken Response with NLP | AISC
2020-09-16 Defining your AI Value Model for Product Success (and Profit) | AISC
2020-09-15 Real-World Quantum Communication: One Module at a Time | AISC
2020-09-15 Predicting and Understanding Human Choices using PCMC-Net with an application to Airline Itineraries
2020-09-14 Product Ideation: From a Hunch to a Concrete Idea
2020-09-14 RadioAssistant - Ranking Radiology Patients using Deep Learning | Workshop Capstone
2020-09-11 Building a better climate model with Machine Learning | AISC
2020-09-10 Set Constrained Temporal Transformer for Set Supervised Action Segmentation | AISC
2020-09-10 An overview of task-oriented dialog systems | AISC
2020-09-09 Targeted Machine Learning for Data Science | AISC
2020-09-08 Build next generation recommenders with NVIDIA Merlin | AISC
2020-09-02 Principal Neighbourhood Aggregation for Graph Nets | AISC
2020-09-01 DeepFakes & Explainable AI Applications in NLP, Biomedical & Malware Classification
2020-08-28 AI Ethics Then & Now: A Look Back on the Last Five Years | AISC
2020-08-27 Beyond Accuracy: Behavioral Testing of NLP Models with CheckList | AISC
2020-08-27 The Summary Loop: Learning to Write Abstractive Summaries Without Examples + Demo | AISC
2020-08-26 [MEM] Learning Permutation Invariant Representations using Memory Networks | AISC