1. Regional and Local Sea Level Change#

../_images/845aff7865f63a069e74ecdf1c2803a4dd2bd46b2d3f4aa348469a5b1358a940.png

In this notebook, we’ll create a table, a map, and a time series plot of absolute and relative sea level change at the Malakal, Palau tide gauge station for 1993-2022. Absolute sea level, typically measured by satellite altimetry, is the height of the sea surface relative to a reference ellipsoid. Here, we’ll use the global ocean gridded L4 Sea Level Anomaly (SLA) product available from Copernicus, where the SLA is the sea surface height (SSH) minus the mean sea surface (MSS), and the MSS is the temporal mean of SSH over a given period. Relative sea level is measured by a tide gauge and is the sea level relative to the land at that location. Differences between the two measurements can arise from vertical land motion, regional oceanographic conditions such as currents, and changes to the gravitational field (which affect the geoid).
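The SLA definition above can be sketched in a few lines: the MSS is just the temporal mean of SSH, and subtracting it leaves a zero-mean anomaly. The SSH values below are synthetic, made up purely for illustration:

```python
import numpy as np

# Synthetic sea surface height record in meters (made-up values)
rng = np.random.default_rng(0)
ssh = 1.5 + 0.001 * np.arange(100) + 0.01 * rng.standard_normal(100)

mss = ssh.mean()   # mean sea surface: temporal mean of SSH over the record
sla = ssh - mss    # sea level anomaly: SSH minus MSS, zero-mean by construction
```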

Download Files: Map | Time Series Plot | Table

1.1. Setup#

First, we need to import all the necessary libraries.

# import necessary libraries
import numpy as np
import xarray as xr
import datetime as dt
from pathlib import Path
import pandas as pd
import os

# data retrieval libraries
import requests
from urllib.request import urlretrieve #used for downloading files
import json
import copernicus_marine_client as copernicusmarine

# data processing libraries
from scipy import stats
import seaborn as sns
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import matplotlib.pyplot as plt

from myst_nb import glue #used for figure numbering when exporting to LaTeX

Next, we’ll establish our directory for saving and reading data, and our output files.

data_dir = Path('../data')
output_dir = Path('../output')  

Then we’ll establish some basic plotting rules for this notebook to keep everything looking uniform.

plt.rcParams['figure.figsize'] = [10, 4]  # Set a default figure size for the notebook
plt.rcParams['figure.dpi'] = 100  # Set default resolution for inline figures

# Set the default font size for axes labels, titles and ticks
plt.rcParams['axes.titlesize'] = 16  # Set the font size for axes titles
plt.rcParams['axes.labelsize'] = 14  # Set the font size for x and y labels
plt.rcParams['xtick.labelsize'] = 12 # Set the font size for x-axis tick labels
plt.rcParams['ytick.labelsize'] = 12 # Set the font size for y-axis tick labels
plt.rcParams['font.size'] = 14 # Set the font size for the text in the figure (can affect legend)

1.2. Retrieve Data Sources#

We are interested in getting tide gauge and altimetry data for Palau (and its EEZ) for 1993 through 2022. Let’s first establish where the tide gauge is by looking at the tide gauge dataset.

1.2.1. Retrieve the location of the Malakal, Palau tide gauge#

url = 'https://uhslc.soest.hawaii.edu/data/meta.geojson'  # station metadata (not linked from the main UHSLC site)
uhslc_id = 7

#parse this url to get lat/lon of Malakal tide gauge
r = requests.get(url)
data = r.json()
for i in range(len(data['features'])):
    if data['features'][i]['properties']['uhslc_id'] == uhslc_id:
        lat = data['features'][i]['geometry']['coordinates'][1]
        lon = data['features'][i]['geometry']['coordinates'][0]
        station = data['features'][i]['properties']['name']
        country = data['features'][i]['properties']['country']
        break

lat, lon, station, country
(7.33, 134.463, 'Malakal', 'Palau')

Next, let’s establish a period of record from 1993-2022.

# establish the time period of interest
start_date = dt.datetime(1993,1,1)
end_date = dt.datetime(2022,12,31)

# also save them as strings, for plotting
start_date_str = start_date.strftime('%Y-%m-%d')
end_date_str = end_date.strftime('%Y-%m-%d')

glue("station",station,display=False)
glue("country",country, display=False)
glue("startDateTime",start_date_str, display=False)
glue("endDateTime",end_date_str, display=False)

1.2.2. Retrieve the EEZ boundary for Palau#

Next we’ll define the area of interest using the EEZ boundary. This can be obtained from XXX; for now, it’s stored in the data directory.

# Retrieve the EEZ for Palau
with open(data_dir / 'palauEEZ.json') as f:
    palau = json.load(f)

# extract the coordinates of the EEZ
palau_eez = np.array(palau['features'][0]['geometry']['coordinates'][0][0])

# get the min and max lat and lon of the EEZ, helpful for obtaining CMEMS data
min_lon = np.min(palau_eez[:,0])
max_lon = np.max(palau_eez[:,0])
min_lat = np.min(palau_eez[:,1])
max_lat = np.max(palau_eez[:,1])

1.2.3. Retrieve altimetry data#

We are using the global ocean gridded L4 Sea Surface Heights and Derived Variables from Copernicus, available at https://doi.org/10.48670/moi-00148.

To download a subset of the global altimetry data, run get_CMEMS_data.py from this directory in a terminal with Python >= 3.9 and copernicus_marine_client installed, OR uncomment the call to get_CMEMS_data() and run it in this notebook. To read more about how to download data with the Copernicus Marine Toolbox (new as of December 2023), visit https://help.marine.copernicus.eu/en/articles/7949409-copernicus-marine-toolbox-introduction.

Large data download!

Note that the code block below calls get_CMEMS_data() when the data file is not found. If you change nothing in the function, it will download ~600 MB of data, which may take a long time! You only need to do this once; the dataset will be stored in the data directory you specify (which should be the same data directory we defined above). Comment out the call to get_CMEMS_data() if you want to avoid accidental downloads.

def get_CMEMS_data(
    minlon=125,
    maxlon=140,
    minlat=0,
    maxlat=15,
    start_date_str="1993-01-01T00:00:00",
    end_date_str="2023-04-30T23:59:59",
    data_dir="../data",
):
    """
    Retrieve Copernicus Marine data for a specified region and time period.

    Args:
        minlon (float): Minimum longitude of the region.
        maxlon (float): Maximum longitude of the region.
        minlat (float): Minimum latitude of the region.
        maxlat (float): Maximum latitude of the region.
        start_date_str (str): Start date of the data in ISO 8601 format.
        end_date_str (str): End date of the data in ISO 8601 format.
        data_dir (str): Directory to save the retrieved data.
    """
    copernicusmarine.subset(
        dataset_id="cmems_obs-sl_glo_phy-ssh_my_allsat-l4-duacs-0.25deg_P1D",
        variables=["adt", "sla"],
        minimum_longitude=minlon,
        maximum_longitude=maxlon,
        minimum_latitude=minlat,
        maximum_latitude=maxlat,
        start_datetime=start_date_str,
        end_datetime=end_date_str,
        output_directory=data_dir,
        output_filename="cmems_L4_SSH_0.25deg_" + start_date_str[0:4] + "_" + end_date_str[0:4] + ".nc"
    )
fname_cmems = 'cmems_L4_SSH_0.25deg_1993_2023.nc'

# check if the file exists, if not, download it
if not os.path.exists(data_dir / fname_cmems):
    print('CMEMS data not found locally, downloading now...')
    get_CMEMS_data()  # comment this line out to avoid accidental downloads
else:
    print('CMEMS data already downloaded, good to go!')

# open the CMEMS data
ds = xr.open_dataset(data_dir / fname_cmems)

# slice the data to time 1993-01-01 to 2022-12-31
ds = ds.sel(time=slice(start_date, end_date))
ds
CMEMS data already downloaded, good to go!
<xarray.Dataset>
Dimensions:    (time: 10957, latitude: 60, longitude: 60)
Coordinates:
  * latitude   (latitude) float32 0.125 0.375 0.625 0.875 ... 14.38 14.62 14.88
  * longitude  (longitude) float32 125.1 125.4 125.6 125.9 ... 139.4 139.6 139.9
  * time       (time) datetime64[ns] 1993-01-01 1993-01-02 ... 2022-12-31
Data variables:
    adt        (time, latitude, longitude) float64 ...
    sla        (time, latitude, longitude) float64 ...
Attributes: (12/45)
    Conventions:                       CF-1.6
    Metadata_Conventions:              Unidata Dataset Discovery v1.0
    cdm_data_type:                     Grid
    comment:                           Sea Surface Height measured by Altimet...
    contact:                           servicedesk.cmems@mercator-ocean.eu
    creator_email:                     servicedesk.cmems@mercator-ocean.eu
    ...                                ...
    time_coverage_duration:            P1D
    time_coverage_end:                 1993-01-01T12:00:00Z
    time_coverage_resolution:          P1D
    time_coverage_start:               1992-12-31T12:00:00Z
    title:                             DT merged all satellites Global Ocean ...
    copernicus_marine_client_version:  0.10.3

1.2.4. Retrieve Tide Gauge Data#

Next we’ll retrieve tide gauge data from the UHSLC (University of Hawaii Sea Level Center) fast-delivery dataset. The fast-delivery data are released within 1-2 months of data collection and are subject only to basic quality control.

The code block below downloads the data file if it doesn’t already exist in the specified data directory. The tide gauge data are then opened using xarray and the station name is extracted.

uhslc_id = 7
fname = f'h{uhslc_id:03}.nc' # h for hourly, d for daily

url = "https://uhslc.soest.hawaii.edu/data/netcdf/fast/hourly/" 

path = os.path.join(data_dir, fname)

if not os.path.exists(path):
    urlretrieve(os.path.join(url, fname), path) 

    
rsl = xr.open_dataset(path)
station_name = rsl['station_name'].values[0]
rsl
<xarray.Dataset>
Dimensions:               (record_id: 1, time: 477345)
Coordinates:
  * time                  (time) datetime64[ns] 1969-05-18T15:00:00 ... 2023-...
  * record_id             (record_id) int16 70
Data variables:
    sea_level             (record_id, time) float32 ...
    lat                   (record_id) float32 ...
    lon                   (record_id) float32 ...
    station_name          (record_id) |S7 b'Malakal'
    station_country       (record_id) |S5 ...
    station_country_code  (record_id) float32 ...
    uhslc_id              (record_id) int16 ...
    gloss_id              (record_id) float32 ...
    ssc_id                (record_id) |S4 ...
    last_rq_date          (record_id) datetime64[ns] ...
Attributes:
    title:                  UHSLC Fast Delivery Tide Gauge Data (hourly)
    ncei_template_version:  NCEI_NetCDF_TimeSeries_Orthogonal_Template_v2.0
    featureType:            timeSeries
    Conventions:            CF-1.6, ACDD-1.3
    date_created:           2023-12-07T14:34:12Z
    publisher_name:         University of Hawaii Sea Level Center (UHSLC)
    publisher_email:        philiprt@hawaii.edu, markm@soest.hawaii.edu
    publisher_url:          http://uhslc.soest.hawaii.edu
    summary:                The UHSLC assembles and distributes the Fast Deli...
    processing_level:       Fast Delivery (FD) data undergo a level 1 quality...
    acknowledgment:         The UHSLC Fast Delivery database is supported by ...

1.3. Process the data#

Now we’ll convert the tide gauge data into a daily record for the period of record (POR), in units of meters, to match the CMEMS data.

The next code block:

  • extracts tide gauge data for the period 1993-2022

  • converts it to meters

  • removes any NaN values

  • resamples the data to daily mean

  • and normalizes it relative to the 1993-2012 epoch.

The resulting data is stored in the variable ‘rsl_daily’ with units in meters.

# Extract the data for the period of record (POR)
tide_gauge_data_POR = rsl['sea_level'].sel(time=slice(start_date, end_date))

# Convert to meters and drop any NaN values
tide_gauge_data_meters = tide_gauge_data_POR / 1000  # Convert from mm to meters
tide_gauge_data_clean = tide_gauge_data_meters.dropna(dim='time')

# Resample the tide gauge data to daily mean before subtracting the epoch mean
tide_gauge_daily_avg = tide_gauge_data_clean.resample(time='1D').mean()

# Normalize the data relative to the 1993-2012 epoch
# (taken here as 1993-01-01 through 2011-12-31)
epoch_start, epoch_end = start_date, '2011-12-31'
epoch_daily_avg = tide_gauge_daily_avg.sel(time=slice(epoch_start, epoch_end))
epoch_daily_mean = epoch_daily_avg.mean()

# Subtract the epoch daily mean from the tide gauge daily average
rsl_daily = tide_gauge_daily_avg - epoch_daily_mean

# Set the attributes of the rsl_daily data
rsl_daily.attrs = tide_gauge_data_POR.attrs
rsl_daily.attrs['units'] = 'm'
rsl_daily.attrs['record_id'] = rsl_daily.coords['record_id'].values[0]

# Remove the 'record_id' dimension
rsl_daily = rsl_daily.drop('record_id')

# Remove any singleton dimensions
rsl_daily = rsl_daily.squeeze()

rsl_daily
<xarray.DataArray 'sea_level' (time: 10957)>
array([-0.12749314, -0.1223681 , -0.16636801, ...,  0.07934022,
        0.09979868, -0.28803468], dtype=float32)
Coordinates:
  * time     (time) datetime64[ns] 1993-01-01 1993-01-02 ... 2022-12-31
Attributes:
    long_name:  relative sea level
    units:      m
    source:     in situ tide gauge water level observations
    platform:   station_name, station_country, station_country_code, uhslc_id...
    record_id:  70
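The convert-resample-normalize pattern used above can be sketched with plain pandas. The hourly values and the epoch window below are synthetic, made up for illustration:

```python
import numpy as np
import pandas as pd

# Synthetic hourly water levels in millimeters (made-up dates and values)
idx = pd.date_range("1993-01-01", "1993-01-10 23:00", freq="60min")
hourly_mm = pd.Series(1000 + 50 * np.sin(np.arange(len(idx)) / 12.0), index=idx)

daily_m = (hourly_mm / 1000).dropna().resample("1D").mean()  # mm -> m, daily means
epoch_mean = daily_m.loc["1993-01-01":"1993-01-05"].mean()   # mean over a chosen epoch
normalized = daily_m - epoch_mean                            # zero the series about the epoch
```

By construction, the normalized series averages to zero over the chosen epoch, which is exactly the property checked for the CMEMS SLA below.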

Run a quick check to see whether the absolute sea level (SLA) from CMEMS is in fact zeroed about the 1993-2012 epoch.

# Compute the CMEMS SLA mean over the 1993-2012 epoch at the tide gauge location
epoch_daily_mean_cmems = ds['sla'].sel(
    time=slice(epoch_start, epoch_end)
).sel(
    longitude=lon, latitude=lat, method='nearest'
).mean(dim='time')

formatted_mean = format(epoch_daily_mean_cmems.values, ".2f")

# Print the mean with a note to re-check source data
print(
    f'The mean for the [1993,2012] epoch of the SLA is {formatted_mean} m. '
    'Re-check source data to make sure this is correct.'
)
The mean for the [1993,2012] epoch of the SLA is 0.02 m. Re-check source data to make sure this is correct.

1.3.1. Clip#

Next we’ll clip the Altimetry Data Set to the area/grid of interest. For now, we’ll focus only on the grid cell nearest to the Malakal tide gauge.

sla = ds['sla'].sel(time=slice(start_date, end_date))
time_cmems = pd.to_datetime(sla['time'].values)

# Extract data for the nearest point to the tide gauge location
sla_nearest = sla.sel(longitude=lon, latitude=lat, method='nearest')

sla_nearest_lat = sla_nearest['latitude'].values
sla_nearest_lon = sla_nearest['longitude'].values

# Format lat lon strings to have decimal symbol and N/S and E/W
lat_str = f'{np.abs(sla_nearest_lat):.3f}\u00B0{"N" if sla_nearest_lat > 0 else "S"}'
lon_str = f'{np.abs(sla_nearest_lon):.3f}\u00B0{"E" if sla_nearest_lon > 0 else "W"}'

print(f'The closest altimetry grid point is {lat_str}, {lon_str}')
sla_nearest
The closest altimetry grid point is 7.375°N, 134.375°E
<xarray.DataArray 'sla' (time: 10957)>
[10957 values with dtype=float64]
Coordinates:
    latitude   float32 7.375
    longitude  float32 134.4
  * time       (time) datetime64[ns] 1993-01-01 1993-01-02 ... 2022-12-31
Attributes:
    ancillary_variables:  err_sla
    comment:              The sea level anomaly is the sea surface height abo...
    grid_mapping:         crs
    long_name:            Sea level anomaly
    standard_name:        sea_surface_height_above_sea_level
    units:                m

Let’s make a map to determine exactly where these points are in space. First we’ll define a function to prettify the map, which we’ll use for the rest of the notebook.

def add_zebra_frame(ax, lw=2, segment_length=0.5, crs=ccrs.PlateCarree()):
    # Get the current extent of the map
    left, right, bot, top = ax.get_extent(crs=crs)

    # Calculate the nearest 0 or 0.5 degree mark within the current extent
    left_start = left - left % segment_length
    bot_start = bot - bot % segment_length

    # Adjust the start if it does not align with the desired segment start
    if left % segment_length >= segment_length / 2:
        left_start += segment_length
    if bot % segment_length >= segment_length / 2:
        bot_start += segment_length

    # Extend the frame slightly beyond the map extent to ensure full coverage
    right_end = right + (segment_length - right % segment_length)
    top_end = top + (segment_length - top % segment_length)

    # Calculate the number of segments needed for each side
    num_segments_x = int(np.ceil((right_end - left_start) / segment_length))
    num_segments_y = int(np.ceil((top_end - bot_start) / segment_length))

    # Draw horizontal stripes at the top and bottom
    for i in range(num_segments_x):
        color = 'black' if (left_start + i * segment_length) % (2 * segment_length) == 0 else 'white'
        start_x = left_start + i * segment_length
        end_x = start_x + segment_length
        ax.hlines([bot, top], start_x, end_x, colors=color, linewidth=lw, transform=crs)

    # Draw vertical stripes on the left and right
    for j in range(num_segments_y):
        color = 'black' if (bot_start + j * segment_length) % (2 * segment_length) == 0 else 'white'
        start_y = bot_start + j * segment_length
        end_y = start_y + segment_length
        ax.vlines([left, right], start_y, end_y, colors=color, linewidth=lw, transform=crs)
../_images/85d627a9532689e322bb6dfb0e073193a65ee0009ee9103d7a984dd5f0e8b71e.png

1.3.2. Plot the timeseries#

Here, we’ll plot the time series of the altimetry data at the nearest location to the tide gauge (aka ‘sla_nearest’). The units of sla_nearest are in meters, so we’ll multiply by 100 to plot in centimeters.

# Set the style of the plot
sns.set_style("whitegrid")

# Create a color palette
palette = sns.color_palette("Paired")

# Create the figure and the axes
fig, ax = plt.subplots()

# Plot the data

# plot altimetry data
ax.scatter(sla_nearest['time'], 100*sla_nearest, label='Altimetry', color=palette[1], alpha=1, s= 5)

# Set the title and labels
ax.set_title(f'Altimetry ({lat_str}, {lon_str}) ({start_date_str} to {end_date_str})')
ax.set_xlabel('Year')
ax.set_ylabel('Height (cm)')

# Set the y limits
ax.set_ylim([-35, 35])

# Set the x limits
ax.set_xlim([start_date, end_date])
../_images/b16eeeb615caedba3e061b7c7370371d94f1bc06a8872ed1477b55d68fa250ef.png

1.3.3. Calculate change#

Now that we have all of our data sources, we’ll calculate the absolute and relative sea level change (magnitude in cm) at this location for the period of record (1993-2022).

To do this, we’ll first define a function that calculates the sea level change magnitude.

def process_trend_with_nan(sea_level_anomaly):
    # Flatten the data and get a time index
    # first ensure time is the first dimension regardless of other dimensions
    sea_level_anomaly = sea_level_anomaly.transpose('time', ...)
    sla_flat = sea_level_anomaly.values.reshape(sea_level_anomaly.shape[0], -1)
    time_index = pd.to_datetime(sea_level_anomaly.time.values).to_julian_date()

    detrended_flat = np.full_like(sla_flat, fill_value=np.nan)

    # Loop over each grid point
    for i in range(sla_flat.shape[1]):
        # Get the time series for this grid point
        y = sla_flat[:,i]
        mask = ~np.isnan(y)

        if mask.sum() >= 2:  # linregress needs at least two valid points
            time_masked = time_index[mask]
            y_masked = y[mask]

            slope, intercept, _, _, _ = stats.linregress(time_masked, y_masked)
            trend = slope * time_index + intercept

            detrended_flat[:,i] = y - trend

    detrended = detrended_flat.reshape(sea_level_anomaly.shape)

    # Calculate trend magnitude
    sea_level_trend = sea_level_anomaly - detrended
    trend_mag = sea_level_trend[-1] - sea_level_trend[0]

    times = pd.to_datetime(sea_level_anomaly['time'].values)
    time_mag = (times[-1] - times[0]).days/365.25 # in years

    trend_rate = trend_mag / time_mag

    return trend_mag, sea_level_trend, trend_rate  
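A sanity check on the logic above: because the fitted trend is linear, its magnitude over the record is just slope × duration. A toy example with numpy's polyfit standing in for scipy's linregress (synthetic data, made up for illustration):

```python
import numpy as np

t = np.arange(30.0)          # time in years (synthetic)
y = 0.5 * t + np.sin(t)      # 0.5 cm/yr trend plus an oscillation

slope, intercept = np.polyfit(t, y, 1)   # least-squares linear fit
trend = slope * t + intercept
trend_mag = trend[-1] - trend[0]         # equals slope * (t[-1] - t[0])
```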

Now we’ll run that function for every grid point in our dataset, and make special variables for our proxy tide gauge location.

trend_mag_cmems, trend_line_cmems, trend_rate_cmems = process_trend_with_nan(sla)

trend_mag_asl, trend_line_asl, trend_rate_asl = process_trend_with_nan(sla_nearest)

Next, calculate the area-weighted mean of the sea level change magnitude for the region of interest. (The cosine-latitude weighting changes little over a domain this close to the equator, but we include it for good practice.)

# calculate the area weights as cosine of latitude
# For a rectangular grid, this is equivalent to multiplying by the grid cell area
weights = np.cos(np.deg2rad(sla.latitude))
weights.name = "weights"

# apply weights to the trend data
trend_mag_weighted = trend_mag_cmems.weighted(weights)

# calculate the regional mean
trend_mag_regional = trend_mag_weighted.mean(dim=('latitude', 'longitude'))

# prepare the output string
output = (
    'The regional magnitude of sea level change is {:.2f} cm for the time '
    'period bounded by {} and {}.'
).format(100*trend_mag_regional.values, start_date_str, end_date_str)

print(output)
The regional magnitude of sea level change is 15.62 cm for the time period bounded by 1993-01-01 and 2022-12-31.
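To see what the cosine-latitude weighting does, here is a toy weighted mean. The latitudes and values below are made up; higher-latitude cells get slightly smaller weights:

```python
import numpy as np

lats = np.array([0.125, 7.5, 14.875])   # degrees north (made up)
vals = np.array([1.0, 2.0, 3.0])        # e.g. trend magnitudes in cm (made up)

w = np.cos(np.deg2rad(lats))            # area weights proportional to cos(latitude)
wmean = (vals * w).sum() / w.sum()      # weighted regional mean
```

Because the largest value sits at the highest latitude, the weighted mean comes out slightly below the unweighted mean of 2.0.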

1.3.4. Plot a map#

Now we’ll plot the magnitude of sea level change on a map that includes the EEZ boundary.

def plot_map(vmin, vmax, xlims, ylims):
    """
    Plot a map of the magnitude of sea level change.

    Parameters:
    vmin (float): Minimum value for the color scale.
    vmax (float): Maximum value for the color scale.
    xlims (tuple): Tuple of min and max values for the x-axis limits.
    ylims (tuple): Tuple of min and max values for the y-axis limits.

    Returns:
    fig (matplotlib.figure.Figure): The matplotlib figure object.
    ax (matplotlib.axes._subplots.AxesSubplot): The matplotlib axes object.
    crs (cartopy.crs.Projection): The cartopy projection object.
    cmap (matplotlib.colors.Colormap): The colormap used for the plot.
    """
    crs = ccrs.PlateCarree()
    fig, ax = plt.subplots(figsize=(6, 6), subplot_kw={'projection': crs})
    ax.set_xlim(xlims)
    ax.set_ylim(ylims)

    palette = sns.color_palette("mako", as_cmap=True)
    cmap = palette

    ax.coastlines()
    ax.add_feature(cfeature.LAND, color='lightgrey')

    return fig, ax, crs, cmap

def plot_zebra_frame(ax, lw=5, segment_length=2, crs=ccrs.PlateCarree()):
    """
    Plot a zebra frame on the given axes.

    Parameters:
    - ax: The axes object on which to plot the zebra frame.
    - lw: The line width of the zebra frame. Default is 5.
    - segment_length: The length of each segment in the zebra frame. Default is 2.
    - crs: The coordinate reference system of the axes. Default is ccrs.PlateCarree().
    """
    # Call the function to add the zebra frame
    add_zebra_frame(ax=ax, lw=lw, segment_length=segment_length, crs=crs)
    # add map grid
    gl = ax.gridlines(draw_labels=True, linestyle=':', color='black',
                      alpha=0.5,xlocs=ax.get_xticks(),ylocs=ax.get_yticks())
    #remove labels from top and right axes
    gl.top_labels = False
    gl.right_labels = False
xlims = [128, 138]
ylims = [0, 13]
vmin, vmax = 6, 22

fig, ax, crs, cmap = plot_map(vmin, vmax, xlims, ylims)

# plot the trend*100 for centimeters
trend_mag_cmems_cm = trend_mag_cmems * 100

# plot a map of the magnitude of SL change in centimeters
trend_mag_cmems_cm.plot(ax=ax, transform=crs,
                         vmin=vmin, vmax=vmax, cmap=cmap, add_colorbar=True, 
                         cbar_kwargs={'label': 'Δ Sea Level (cm, 1993-2022)'},
)

# add the EEZ
ax.plot(palau_eez[:, 0], palau_eez[:, 1], transform=crs, color='black', linewidth=2)


# Call the function to add the zebra frame
plot_zebra_frame(ax, lw=5, segment_length=2, crs=crs)
../_images/12f9400f12a99530ea55dfcee935969e43bab0f8b1f433c2d3d7383670f5e9a1.png

1.3.5. Plot time series#

Now we’ll plot a time series of the absolute sea level at the grid point nearest the tide station, including a trend line and an annotation with the sea level change magnitude (cm) and rate (cm/year).

# Set the style of the plot
sns.set_style("whitegrid")

# Create a color palette
palette = sns.color_palette("Set1")

# Plot the timeseries
# Create the figure and the axes
fig, ax = plt.subplots()

# plot altimetry data
ax.scatter(sla_nearest['time'], 100*sla_nearest, label='Altimetry', 
           color=palette[0], alpha=0.2, s=5)
ax.plot(time_cmems, 100*trend_line_asl, label='Altimetry Trend', 
        color=palette[0], linestyle='--')

# Set the title and labels
title = f'Altimetry ({lat_str}, {lon_str}) ({start_date_str} to {end_date_str})'
ax.set_title(title)
ax.set_xlabel('Year')
ax.set_ylabel('Height (cm)')

# Set the x limits
ax.set_xlim([start_date, end_date])

# Build the annotation string
trendmag_str = (f'Δ Sea Level: {100*trend_mag_asl:.2f} cm, '
                f'Trend: {100*trend_rate_asl:.2f} cm/year')

# Add text in a white box to bottom right of plot
ax.text(0.95, 0.05, trendmag_str, transform=ax.transAxes, 
        verticalalignment='bottom', horizontalalignment='right', 
        bbox=dict(facecolor='white', alpha=0.5))
../_images/cd1086a25f214ee396b81b09f3df97e117a116cd88caed03e93266a373465f64.png

1.4. Analyze the Tide Station Data#

1.4.1. Assess Station Data Quality#

We’ll assess the station data quality for the POR (1993-2022). (To do: consult with Ayesha and John.)

1.4.2. Plot the daily time series#

For a quick glance at the data

# Set the style of the plot
sns.set_style("whitegrid")

# Create a color palette
palette = sns.color_palette("Paired")

# Create the figure and the axes
fig, ax = plt.subplots()

# Plot the data

# plot tide gauge data
ax.scatter(rsl_daily['time'],100*rsl_daily, label='RSL', 
           color=palette[1], alpha=1, s= 5)

# Set the title, labels, and limits
ax.set(
    title=f'Tide Gauge ({station_name}) ({start_date_str} to {end_date_str})',
    xlabel='Year',
    ylabel='Water Level (cm)',
    ylim=[-50, 50],
    xlim=[start_date, end_date]
);
../_images/7498583d2ef8601a40a8ecf5ce8af34954803b7fb002c66d60a95d32bc386a94.png

1.4.3. Calculate rate and magnitude of change#

Now we’ll calculate both the trend (rate of change) and the magnitude of change for the tide gauge record.

# calculate the rate of change for the tide gauge
trend_mag_rsl, trend_line_rsl, trend_rate_rsl = process_trend_with_nan(rsl_daily)

print(f'The trend magnitude for the tide gauge is {100*trend_mag_rsl:.2f} cm.')
print(f'The trend rate for the tide gauge is {100*trend_rate_rsl:.2f} cm/year.')
The trend magnitude for the tide gauge is 12.26 cm.
The trend rate for the tide gauge is 0.41 cm/year.

1.4.4. Plot time series#

Now we’ll combine our daily relative sea level time series, the monthly mean, and a trend line into the same plot. We will label it with the relative sea level change (magnitude in cm) and rate of change (cm/yr) at the tide station. The “zero” line is referenced to the 1993-2012 epoch, following the altimetry.

# make an rsl monthly mean for plotting
rsl_monthly = rsl_daily.resample(time='1M').mean().squeeze()

# Set the style of the plot
sns.set_style("whitegrid")

# Create a color palette
palette = sns.color_palette("Set1")

# Plot the timeseries
# Create the figure and the axes
fig, ax = plt.subplots()
# plot daily tide gauge data
ax.scatter(rsl_daily['time'], 100*rsl_daily, 
           label='Tide Gauge', color=palette[1], alpha=0.2, s= 5)
ax.plot(rsl_daily['time'], 100*trend_line_rsl, 
        label='Tide Gauge Trend', color=palette[1], linestyle='--')
# plot the monthly mean sea level
ax.plot(rsl_monthly['time'], 100*rsl_monthly, 
        label='Tide Gauge', color=palette[3])


# Set the title and labels
ax.set_title(f'Tide Gauge ({station_name}) ({start_date_str} to {end_date_str})')
ax.set_xlabel('Year')
ax.set_ylabel('Height (cm)')

# Set the y limits
ax.set_ylim([-50, 50])

# Set the x limits
ax.set_xlim([start_date, end_date])

# Build the annotation string
trendmag_str = f'Δ Sea Level: {100*trend_mag_rsl:.2f} cm, Trend: {100*trend_rate_rsl:.2f} cm/year'
# Add text in a white box to bottom right of plot
ax.text(0.95, 0.05, trendmag_str, transform=ax.transAxes, 
        fontsize=14, verticalalignment='bottom', horizontalalignment='right',
        bbox=dict(facecolor='white', alpha=0.5))
../_images/43147d14cc9e1060744606493810c58e45552d1bfd3d1847791799ed41a21733.png

1.5. Combining both sources#

1.5.1. Create a Table#

We’ll create a table that compares the results of Section 1.3.5 and Section 1.4.4.

# Constants
DATA_SOURCE_ALTIMETRY = 'CMEMS SSH L4 0.25 deg (SLA)'
DATA_SOURCE_TIDE_GAUGE = 'UHSLC Fast Delivery'
TIME_PERIOD = f'{start_date_str} to {end_date_str}'

# Calculated values 
trend_mmyr_altimetry = 1000 * trend_rate_asl.values
trend_mmyr_tide_gauge = 1000 * trend_rate_rsl.values
delta_sea_level_altimetry = 100 * trend_mag_asl.values
delta_sea_level_tide_gauge = 100 * trend_mag_rsl.values


# Create DataFrame
SL_magnitude_results = pd.DataFrame({
    'Trend (mm/yr)': [trend_mmyr_altimetry, trend_mmyr_tide_gauge],
    'Δ Sea Level (cm)': [delta_sea_level_altimetry, delta_sea_level_tide_gauge],
    'Latitude': [sla_nearest_lat, rsl['lat'].values[0]],
    'Longitude': [sla_nearest_lon, rsl['lon'].values[0]],
    'Time_Period': [TIME_PERIOD, TIME_PERIOD],
    'Data_Source': [DATA_SOURCE_ALTIMETRY, DATA_SOURCE_TIDE_GAUGE],
}, index=['Altimetry', 'Tide Gauge'])

# Save to CSV
output_file_path = output_dir / 'SL_magnitude_results.csv'

# Use the path for operations, e.g., saving a DataFrame
SL_magnitude_results.to_csv(output_file_path)

# glue trend_rates to the notebook
glue("trend_rate_rsl", trend_rate_rsl, display=False)
glue("trend_rate_asl", trend_rate_asl, display=False)

SL_magnitude_results
Trend (mm/yr) Δ Sea Level (cm) Latitude Longitude Time_Period Data_Source
Altimetry 5.790198 17.368217 7.375 134.375 1993-01-01 to 2022-12-31 CMEMS SSH L4 0.25 deg (SLA)
Tide Gauge 4.086637 12.258233 7.33 134.462997 1993-01-01 to 2022-12-31 UHSLC Fast Delivery

1.5.2. Create a Map#

Now we’ll combine both the tide gauge and altimetry sources into a map that shows the absolute change, with an icon depicting the magnitude of relative change at the tide station.

fig, ax, crs, cmap = plot_map(vmin, vmax, xlims, ylims)

# plot the trend*100 for centimeters
trend_mag_cmems_cm = trend_mag_cmems * 100

# plot a map of the magnitude of SL change in centimeters
trend_mag_cmems_cm.plot(ax=ax, transform=crs,
                         vmin=vmin, vmax=vmax, cmap=cmap, add_colorbar=True, 
                         cbar_kwargs={'label': 'Δ Sea Level (cm, 1993-2022)'},
)

# add the EEZ
ax.plot(palau_eez[:, 0], palau_eez[:, 1], transform=crs, color='black', linewidth=2)
# add the tide gauge location with black outlined dot, colored by the sea level value
ax.scatter(rsl['lon'], rsl['lat'], transform=crs, s=200, 
           c=100*trend_mag_rsl, vmin=vmin, vmax=vmax, cmap=cmap,
           linewidth=0.5, edgecolor='black')

plot_zebra_frame(ax, lw=5, segment_length=2, crs=crs)

glue("mag_fig", fig, display=False)

# save the figure
output_file_path = output_dir / 'SL_magnitude_map.png'
fig.savefig(output_file_path, dpi=300, bbox_inches='tight')
../_images/18102f90bf44babf240b4b886385594c61aa6738a8462e72b3736dbe0dde67ce.png

Fig. 1.1 Map of absolute and relative sea level change from the altimetry and tide gauge record near Malakal, Palau station from 1993-01-01 to 2022-12-31.#

1.5.3. Create a Time series plot#

Finally, we’ll combine both the tide gauge and altimetry sources into a single plot showing the absolute and relative sea level time series.

# Set the style of the plot
sns.set_style("whitegrid")

# Create a color palette
palette = sns.color_palette("Set1")

# Create the figure and the axes
fig, ax = plt.subplots()

# plot altimetry data
ax.scatter(sla_nearest['time'], 100*sla_nearest, 
           label='Altimetry', color=palette[0], alpha=0.2, s=5)
ax.plot(sla_nearest['time'], 100*trend_line_asl, 
        label='Altimetry Trend', color=palette[0], linestyle='--')

# plot tide gauge data
ax.scatter(rsl_daily['time'], 100*rsl_daily, 
           label='Tide Gauge', color=palette[1], alpha=0.2, s=10)
ax.plot(rsl_daily['time'], 100*trend_line_rsl, 
        label='Tide Gauge Trend', color=palette[1], linestyle='--')

# Set the title and labels
title = (
    f'Altimetry ({lat_str}, {lon_str}) vs \n'
    f'Tide Gauge ({station_name}) ({start_date_str} to {end_date_str})'
)
ax.set_title(title)
ax.set_xlabel('Year')
ax.set_ylabel('Height (cm)')

# Set the y and x limits
ax.set_ylim([-40, 40])
ax.set_xlim([start_date, end_date])

# Add a legend below the plot
ax.legend(loc='upper center', bbox_to_anchor=(0.5, -0.15), ncol=2)

glue("trend_fig", fig, display=False)

# save the figure
output_file_path = output_dir / 'SL_magnitude_timeseries.png'
fig.savefig(output_file_path, dpi=300, bbox_inches='tight')
../_images/845aff7865f63a069e74ecdf1c2803a4dd2bd46b2d3f4aa348469a5b1358a940.png

Fig. 1.2 Absolute sea level trend (red) from altimetry, and relative sea level trend (blue) from the tide gauge record at the Malakal, Palau station from 1993-01-01 to 2022-12-31.#