Relampago_log

Convection initiation in Sierras de Cordoba


Overview

We will look at the convective storm that developed near the Sierras de Cordoba on 2019-01-29.

Prerequisites

Concepts                    Importance    Notes
Intro to Cartopy            Necessary
Understanding of NetCDF     Helpful       Familiarity with metadata structure

  • Time to learn: 10 minutes


Imports

Let’s import our required libraries

import act
import xarray as xr
import cmweather
import matplotlib.pyplot as plt
import numpy as np
import pyart
import hvplot.xarray
import holoviews as hv
import glob
import xradar as xd
import cartopy.crs as ccrs
from dask.distributed import Client, LocalCluster
from functools import partial
hv.extension("bokeh")
## You are using the Python ARM Radar Toolkit (Py-ART), an open source
## library for working with weather radar data. Py-ART is partly
## supported by the U.S. Department of Energy as part of the Atmospheric
## Radiation Measurement (ARM) Climate Research Facility, an Office of
## Science user facility.
##
## If you use this software to prepare a publication, please cite:
##
##     JJ Helmus and SM Collis, JORS 2016, doi: 10.5334/jors.119

Dask Cluster

Let’s spin up our local cluster using Dask

client = Client(LocalCluster())

Data acquisition

Then, we can use the Atmospheric Data Community Toolkit (ACT) to easily download the data. The act.discovery.download_arm_data function will allow us to download the data directly to our computer. Watch for the data citation! Show some support for ARM’s instrument experts and cite their data if you use it in a publication.

username = "XXXXXX"  ## Use your ARM username
token = "XXXXX" ### Use your ARM token (https://sso.arm.gov/arm/login?service=https%3A%2F%2Fadc.arm.gov%2Farmlive%2Flogin%2Fcas)

We are going to analyze data from the C-band Scanning ARM Precipitation Radar (CSAPR) and the Ka-band ARM Zenith Radar (KAZR). Data from both sensors can be found at the ARM Data Discovery portal under the data tab. There, we can filter data by date, sensor, and field campaign. Once the desired data is located, we can get its datastream identifier. In our case, it is corcsapr2cmacppiM1.c1 for CSAPR and corarsclkazr1kolliasM1.c1 for KAZR.

# Set the datastreams and start/end dates
kzar = 'corarsclkazr1kolliasM1.c1'
csarp = "corcsapr2cmacppiM1.c1"
startdate = '2019-01-29T13:00:00'
enddate = '2019-01-29T20:00:00'
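ARM datastream identifiers like the two above follow a (site)(instrument)(facility).(data level) naming convention. As an illustration (this is a hypothetical helper, not an official ARM parser), we can split ours apart with a small regular expression:

```python
import re

def parse_datastream(name):
    """Split an ARM-style datastream ID into its parts.

    Illustrative only: assumes a 3-letter site code, a lowercase
    instrument string, a facility designator (e.g. M1), and a data level.
    """
    m = re.match(r"^([a-z]{3})([a-z0-9]+?)([A-Z]\d+)\.([a-z]\d)$", name)
    if m is None:
        raise ValueError(f"Unrecognized datastream ID: {name}")
    site, instrument, facility, level = m.groups()
    return {"site": site, "instrument": instrument,
            "facility": facility, "level": level}

print(parse_datastream("corcsapr2cmacppiM1.c1"))
# {'site': 'cor', 'instrument': 'csapr2cmacppi', 'facility': 'M1', 'level': 'c1'}
```

Both of our datastreams come from the cor (Cordoba) site at the M1 facility, at data level c1.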

To examine the CACTI domain, we’ll read in one timestep from the LASSO simulations and plot the terrain.

path_staging = "/data/project/ARM_Summer_School_2024_Data/lasso_tutorial/cacti/lasso-cacti"  # path on Jupyter
file_list = sorted(glob.glob(f'{path_staging}/20190129/eda09/base/les/subset_d3/corlasso_met_*'))

ds = xr.open_dataset(file_list[50])
ds
<xarray.Dataset> Size: 6GB
Dimensions:               (Time: 1, south_north: 865, west_east: 750,
                           bottom_top: 149)
Coordinates:
  * Time                  (Time) datetime64[ns] 8B 2019-01-29T18:30:00
    XLONG                 (south_north, west_east) float32 3MB ...
    XLAT                  (south_north, west_east) float32 3MB ...
    XTIME                 (Time) float32 4B ...
Dimensions without coordinates: south_north, west_east, bottom_top
Data variables: (12/52)
    ITIMESTEP             (Time) int32 4B ...
    MUTOT                 (Time, south_north, west_east) float32 3MB ...
    HGT                   (Time, south_north, west_east) float32 3MB ...
    HAMSL                 (Time, bottom_top, south_north, west_east) float32 387MB ...
    P_HYD                 (Time, bottom_top, south_north, west_east) float32 387MB ...
    PRESSURE              (Time, bottom_top, south_north, west_east) float32 387MB ...
    ...                    ...
    MULFC                 (Time, south_north, west_east) float32 3MB ...
    MULNB                 (Time, south_north, west_east) float32 3MB ...
    MULPL                 (Time, south_north, west_east) float32 3MB ...
    MUCAPE                (Time, south_north, west_east) float32 3MB ...
    MUCIN                 (Time, south_north, west_east) float32 3MB ...
    REFL_10CM_MAX         (Time, south_north, west_east) float32 3MB ...
Attributes: (12/39)
    DX:                          500.0
    DY:                          500.0
    SIMULATION_START_DATE:       2019-01-29_06:00:00
    WEST-EAST_GRID_DIMENSION:    751
    SOUTH-NORTH_GRID_DIMENSION:  866
    BOTTOM-TOP_GRID_DIMENSION:   150
    ...                          ...
    doi_isPartOf_lasso-cacti:    https://doi.org/10.5439/1905789
    doi_isDocumentedBy:          https://doi.org/10.2172/1905845
    doi_thisFileType:            https://doi.org/10.5439/1905819
    history:                     processed by user d3m088 on machine cirrus28...
    filename_user:               corlasso_met_2019012900eda09d3_base_M1.m1.20...
    filename_storage:            corlassomet2019012900eda09d3baseM1.m1.201901...

We’ll also plot the location of the CSAPR radar using the metadata from one of the radar files

files = sorted(glob.glob('corcsapr2cmacppiM1.c1/*'))
radar = xd.io.open_cfradial1_datatree(files[-6], first_dim="auto")
radar = radar.xradar.georeference()
display(radar)
<xarray.DatasetView> Size: 740B
Dimensions:              (sweep: 15)
Dimensions without coordinates: sweep
Data variables:
    sweep_group_name     (sweep) <U10 600B 'sweep_0.0' ... 'sweep_14.0'
    sweep_fixed_angle    (sweep) float32 60B ...
    latitude             float32 4B ...
    longitude            float32 4B ...
    altitude             float32 4B ...
    time_coverage_start  |S32 32B ...
    time_coverage_end    |S32 32B ...
    volume_number        int32 4B ...
Attributes:
    version:      2.0 lite
    history:      created by zsherman on or-condo-c215.ornl.gov at 2020-10-01...
    references:   See CSAPR2 Instrument Handbook
    institution:  United States Department of Energy - Atmospheric Radiation ...
    source:       Atmospheric Radiation Measurement (ARM) program C-band Scan...
    Conventions:  CF/Radial instrument_parameters ARM-1.3
    comment:      This is highly experimental and initial data. There are man...

CACTI Domain Overview

CSAPR_lat = float(radar.latitude.values)
CSAPR_lon = float(radar.longitude.values)

cf = plt.contourf(ds.XLONG, ds.XLAT, ds.HGT[0,:,:], cmap='terrain', levels=np.arange(0, 2500, 50))
plt.scatter(CSAPR_lon, CSAPR_lat, s=50, c='tab:red', label='ARM Site')
plt.title('CACTI Domain', fontsize=15)
plt.colorbar(cf, label='Height (m)')
plt.legend()
<matplotlib.legend.Legend at 0x7fa060a4c2d0>
[Figure: CACTI domain terrain map with the ARM site marked in red]

Now let’s examine the convection occurring during this particular day. To do this, we’ll use one of the radar files we imported above.

vel_plot = radar['sweep_0']["mean_doppler_velocity"].hvplot.quadmesh(x="x", y="y", cmap="balance", 
                                                          height=300, width=400, 
                                                          clim=(-20, 20),
                                                          rasterize=True)
ref_plot = radar['sweep_0']["reflectivity"].hvplot.quadmesh(x="x", y="y", cmap="ChaseSpectral", 
                                                          height=300, width=400, 
                                                          clim=(-20, 70),
                                                          rasterize=True)
ref_plot+vel_plot

We can see that all of the storms seem to be forming along one particular area. It looks like this might be the mountain range in the center of the CACTI domain shown above, but let’s overlay terrain on the radar image to verify this.

proj_crs = xd.georeference.get_crs(radar["sweep_0"].ds)
cart_crs = ccrs.Projection(proj_crs)
fig = plt.figure(figsize=(9, 9))
ax = fig.add_subplot(111, projection=ccrs.PlateCarree())
radar["sweep_0"]["reflectivity"].plot(
    x="x",
    y="y",
    cmap="ChaseSpectral",
    transform=cart_crs,
    cbar_kwargs=dict(pad=0.075, shrink=0.75),
    vmin=-20,
    vmax=70
)
ax.coastlines()
ax.gridlines(draw_labels=True)

ax.set_xlim([-66,-63.5])
ax.set_ylim([-33.2, -31])

cf = plt.contour(ds.XLONG, ds.XLAT, ds.HGT[0,:,:], cmap='terrain', levels=np.arange(0, 2500, 350), 
                 transform=ccrs.PlateCarree())
/data/project/jupyterhub/notebook-envs/arm-summer-school-2024-env/lib/python3.11/site-packages/cartopy/mpl/geoaxes.py:1768: UserWarning: The input coordinates to pcolormesh are interpreted as cell centers, but are not monotonically increasing or decreasing. This may lead to incorrectly calculated cell edges, in which case, please supply explicit cell edges to pcolormesh.
  result = super().pcolormesh(*args, **kwargs)
[Figure: CSAPR reflectivity overlaid on terrain contours east of the Sierras de Cordoba]

As we suspected, the storms are all located just east of the Sierras de Cordoba. This leads us to our hypothesis…

Science Question

How is deep convection initiation controlled by local and regional meteorological conditions (the thunderstorm life cycle) as well as by geography?

Project Scope

  • Learn how to use LASSO simulations and ARM data.

  • Understand how convection is initiated in our domain by looking at measurements.

  • Determine where convection is initiated in our domain of study.

  • Compare observations and simulations.

Hypothesis

  • Convection initiation near the ARM site is caused largely by orographic effects, surface heating, and advection of moisture from the Pacific.

Radar reflectivity plots

CSAPR

We found 15 files in total for the 2019-01-29 deep convective storm, as follows

files = sorted(glob.glob('corcsapr2cmacppiM1.c1/*'))
files[:3]
['corcsapr2cmacppiM1.c1/corcsapr2cmacppiM1.c1.20190129.130003.nc',
 'corcsapr2cmacppiM1.c1/corcsapr2cmacppiM1.c1.20190129.131503.nc',
 'corcsapr2cmacppiM1.c1/corcsapr2cmacppiM1.c1.20190129.133003.nc']
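Each filename encodes the volume start time as `<datastream>.YYYYMMDD.HHMMSS.nc`. If we ever need the scan times without opening the files, a small sketch like the following (a hypothetical helper, assuming the filename pattern shown above) extracts them:

```python
from datetime import datetime
from pathlib import Path

def scan_time(path):
    """Extract the volume start time from an ARM-style filename.

    Assumes the <datastream>.YYYYMMDD.HHMMSS.nc pattern seen above.
    """
    _, date_str, time_str, _ = Path(path).name.rsplit(".", 3)
    return datetime.strptime(date_str + time_str, "%Y%m%d%H%M%S")

print(scan_time("corcsapr2cmacppiM1.c1/corcsapr2cmacppiM1.c1.20190129.130003.nc"))
# 2019-01-29 13:00:03
```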

Data concatenation

An easy way to handle radar data is to combine, or concatenate, all files into a single dataset. To do this, we need to fix the azimuth angle to ensure the azimuth dimension is equal in all files. We also need to coarsen the azimuth angle, which will make merging across different volume scans easier. To do this, we can use the following function

def fix_angle(ds):
    """
    Aligns the radar volumes
    """
    ds["time"] = ds.time.load()  # Convert time from dask to numpy

    start_ang = 0  # Set consistent start/end values
    stop_ang = 360

    # Find the median angle resolution
    angle_res = ds.azimuth.diff("azimuth").median()

    # Determine whether the radar is spinning clockwise or counterclockwise
    median_diff = ds.azimuth.diff("time").median()
    ascending = median_diff > 0
    direction = 1 if ascending else -1

    # first find exact duplicates and remove
    ds = xd.util.remove_duplicate_rays(ds)

    # second reindex according to retrieved parameters
    ds = xd.util.reindex_angle(
        ds, start_ang, stop_ang, angle_res, direction, method="nearest"
    )

    ds = ds.expand_dims("volume_time")  # Expand for volumes for concatenation

    ds["volume_time"] = [np.nanmin(ds.time.values)]

    return ds
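The key step above is the reindexing, which snaps each volume's rays onto one uniform azimuth grid so every file ends up with the same azimuth dimension. A toy numpy version of that idea (assuming 1-degree resolution and nearest-ray assignment; the real xd.util.reindex_angle is more careful) looks like this:

```python
import numpy as np

# Irregularly spaced azimuths as a radar might record them (degrees)
az_measured = np.array([0.3, 1.1, 2.4, 3.0, 4.6])
vals = np.array([10.0, 12.0, 15.0, 14.0, 11.0])  # e.g. reflectivity per ray

# Target uniform grid: 0, 1, ..., 4 degrees at 1-degree resolution
az_grid = np.arange(0, 5, 1.0)

# Nearest-neighbor reindexing: for each grid angle, take the closest measured ray
idx = np.abs(az_measured[None, :] - az_grid[:, None]).argmin(axis=1)
vals_regridded = vals[idx]
print(vals_regridded)  # [10. 12. 15. 14. 11.]
```

After this step, every volume shares the same 0–359 degree azimuth coordinate, so they can be concatenated safely.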

Now we loop through all CSAPR files, selecting only the lowest elevation angle (sweep_0) and then fixing the angle. The resulting datasets will be stored in a list to be concatenated later

dsets = []
for file in files:
    ds = xd.io.open_cfradial1_datatree(file)["sweep_0"].to_dataset()
    ds["azimuth"] = ds.azimuth.round(2)
    ds = fix_angle(ds)
    dsets.append(ds)

Now we concatenate them into a single large xr.Dataset

ds = xr.concat(dsets, dim='volume_time')
ds
<xarray.Dataset> Size: 3GB
Dimensions:                                                (volume_time: 27,
                                                            azimuth: 360,
                                                            range: 1100)
Coordinates:
  * range                                                  (range) float32 4kB ...
  * azimuth                                                (azimuth) float32 1kB ...
    time                                                   (volume_time, azimuth) datetime64[ns] 78kB ...
    elevation                                              (volume_time, azimuth) float32 39kB ...
    latitude                                               float32 4B -32.13
    longitude                                              float32 4B -64.73
    altitude                                               float32 4B 1.141e+03
  * volume_time                                            (volume_time) datetime64[ns] 216B ...
Data variables: (12/61)
    attenuation_corrected_differential_reflectivity        (volume_time, azimuth, range) float32 43MB ...
    attenuation_corrected_differential_reflectivity_lag_1  (volume_time, azimuth, range) float32 43MB ...
    attenuation_corrected_reflectivity_h                   (volume_time, azimuth, range) float32 43MB ...
    censor_mask                                            (volume_time, azimuth, range) int32 43MB ...
    classification_mask                                    (volume_time, azimuth, range) int32 43MB ...
    copol_correlation_coeff                                (volume_time, azimuth, range) float32 43MB ...
    ...                                                     ...
    rain_rate_A                                            (volume_time, azimuth, range) float64 86MB ...
    sweep_number                                           (volume_time) float64 216B ...
    sweep_fixed_angle                                      (volume_time) float32 108B ...
    sweep_mode                                             (volume_time) <U6 648B ...
    prt                                                    (volume_time, azimuth) float32 39kB ...
    nyquist_velocity                                       (volume_time, azimuth) float32 39kB ...

In order to create radial plots, we need to add georeferencing to our new radar dataset

ds = ds.xradar.georeference()
proj_crs = xd.georeference.get_crs(ds)
cart_crs = ccrs.Projection(proj_crs)
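The georeference step converts each (range, azimuth, elevation) sample into Cartesian x/y/z coordinates relative to the radar. A simplified flat-Earth sketch of that transform (the real xradar implementation also applies an effective-Earth-radius beam-height correction) is:

```python
import numpy as np

def antenna_to_cartesian_simple(r_m, az_deg, el_deg):
    """Flat-Earth antenna-to-Cartesian transform (simplified sketch).

    x points east, y north, z up, all in meters from the radar;
    azimuth is measured clockwise from north.
    """
    az = np.deg2rad(az_deg)
    el = np.deg2rad(el_deg)
    x = r_m * np.cos(el) * np.sin(az)
    y = r_m * np.cos(el) * np.cos(az)
    z = r_m * np.sin(el)
    return x, y, z

# A gate 10 km due east of the radar on a 0.5-degree elevation sweep
x, y, z = antenna_to_cartesian_simple(10000.0, 90.0, 0.5)
print(round(x), round(y), round(z))  # ~10 km east, ~87 m above the radar
```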

Finally, we can create an hvplot animation as follows

ds.reflectivity.hvplot.quadmesh(x='x',
                                          y='y',
                                          cmap='pyart_ChaseSpectral',
                                          groupby='volume_time',
                                          widget_type="scrubber",
                                          widget_location="bottom",
                                          clim=(-20, 70),
                                          width=600,
                                          height=500,
                                          rasterize=True)

KAZR

# KAZR files
# kzar_files = act.discovery.download_arm_data(username, token, kzar, startdate, enddate)
# ds_kzar = act.io.read_arm_netcdf("corarsclkazr1kolliasM1.c1/corarsclkazr1kolliasM1.c1.20190129.000000.nc")
ds_kzar = xr.open_dataset("corarsclkazr1kolliasM1.c1/corarsclkazr1kolliasM1.c1.20190129.000000.nc")
display(ds_kzar)
<xarray.Dataset> Size: 878MB
Dimensions:                               (time: 21600, layer: 10, height: 596,
                                           radar_mode: 4)
Coordinates:
  * time                                  (time) datetime64[ns] 173kB 2019-01...
  * layer                                 (layer) int32 40B 0 1 2 3 4 5 6 7 8 9
  * height                                (height) float32 2kB 160.0 ... 1.80...
  * radar_mode                            (radar_mode) |S2 8B b'hi' ... b'pr'
Data variables: (12/33)
    base_time                             datetime64[ns] 8B ...
    time_offset                           (time) datetime64[ns] 173kB ...
    reflectivity_best_estimate            (time, height) float32 51MB ...
    qc_reflectivity_best_estimate         (time, height) int32 51MB ...
    reflectivity                          (time, height) float32 51MB ...
    qc_reflectivity                       (time, height) int32 51MB ...
    ...                                    ...
    minimum_detectable_reflectivity_flag  (time, height) float32 51MB ...
    reflectivity_saturation_flag          (time, height) float32 51MB ...
    instrument_availability_flag          (time) int16 43kB ...
    lat                                   float32 4B ...
    lon                                   float32 4B ...
    alt                                   float32 4B ...
Attributes: (12/18)
    command_line:                     idl -R -n kazrcfrarscl -n kazrcfrarsclc...
    Conventions:                      ARM-1.2
    process_version:                  vap-kazrcfrarscl-1.6-4.el7
    dod_version:                      arsclkazr1kollias-c1-4.0
    site_id:                          cor
    platform_id:                      arsclkazr1kollias
    ...                               ...
    radar_modes_in_use:               md ge
    maximum_clutter_height:           4000.0 m
    radar_operating_frequency_burst:      34.830 GHz
    radar_operating_frequency_chirp:      34.890 GHz
    doi:                              10.5439/1228768
    history:                          created by user malynn on machine node1...

As we are interested in the convective storm that occurred between 16:00 and 19:00 UTC, we can subset our dataset to keep only the storm period

ds_kzar = ds_kzar.sel(time=slice('2019-01-29 16:00', '2019-01-29 19:00'))
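The sel(time=slice(...)) call keeps only the samples whose timestamps fall inside the window, endpoints included. The same label-based selection in plain Python (with fake 30-minute KAZR timestamps for illustration) looks like:

```python
from datetime import datetime

# Fake 30-minute timestamps spanning the day
times = [datetime(2019, 1, 29, h, m) for h in range(24) for m in (0, 30)]

start = datetime(2019, 1, 29, 16, 0)
end = datetime(2019, 1, 29, 19, 0)

# Inclusive window, mirroring xarray's slice-based label selection
storm_times = [t for t in times if start <= t <= end]
print(len(storm_times), storm_times[0], storm_times[-1])
# 7 2019-01-29 16:00:00 2019-01-29 19:00:00
```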

We would like to see how the radar reflectivity looks for both instruments. Therefore, we can use hvplot to create interactive plots

ref = ds_kzar.reflectivity_best_estimate.hvplot.quadmesh(x="time", y="height", 
                                                         cmap="ChaseSpectral", 
                                                         clim=(-20, 40),
                                                         rasterize=True, 
                                                         width=1000, 
                                                         height=300)
vel = ds_kzar.mean_doppler_velocity.hvplot.quadmesh(x="time", y="height", 
                                                         cmap="pyart_balance", 
                                                         clim=(-20, 20),
                                                         rasterize=True, 
                                                         width=1000, 
                                                         height=300)
(ref + vel).cols(1)

Summary

In this notebook we used ACT to download ARM data from the CACTI field campaign, read CSAPR and KAZR radar files with xradar and xarray, and visualized the 2019-01-29 convective storm with hvplot and Cartopy. Concatenating the CSAPR volume scans let us animate the evolution of the storm, and overlaying radar reflectivity on the LASSO terrain showed that convection initiated just east of the Sierras de Cordoba, consistent with our hypothesis that orographic effects play a leading role in convection initiation near the ARM site.

What’s next?

Next steps include comparing these observations against the LASSO-CACTI simulations in more detail and examining the local and regional meteorological conditions that accompany convection initiation, such as the MUCAPE, MUCIN, and moisture fields available in the LASSO output.

Resources and references

  • Helmus, J. J., and S. M. Collis, 2016: The Python ARM Radar Toolkit (Py-ART), a library for working with weather radar data in the Python programming language. JORS, doi: 10.5334/jors.119

  • LASSO-CACTI dataset: https://doi.org/10.5439/1905789 (documented by https://doi.org/10.2172/1905845)

  • KAZR ARSCL data (corarsclkazr1kolliasM1.c1): doi: 10.5439/1228768