
Sea Level Rise#

Content creators: Aakash Sane, Karsten Haustein

Content reviewers: Brodie Pearson, Abigail Bodner, Jenna Pearson, Chi Zhang, Ohad Zivan

Content editors: Zane Mitrevica, Natalie Steinemann, Jenna Pearson, Chi Zhang, Ohad Zivan

Production editors: Wesley Banfield, Jenna Pearson, Chi Zhang, Ohad Zivan

Our 2023 Sponsors: NASA TOPS and Google DeepMind

# @title Project Background

from ipywidgets import widgets
from IPython.display import YouTubeVideo
from IPython.display import IFrame
from IPython.display import display


class PlayVideo(IFrame):
    def __init__(self, id, source, page=1, width=400, height=300, **kwargs):
        self.id = id
        if source == "Bilibili":
            src = f"https://player.bilibili.com/player.html?bvid={id}&page={page}"
        elif source == "Osf":
            src = f"https://mfr.ca-1.osf.io/render?url=https://osf.io/download/{id}/?direct%26mode=render"
        super(PlayVideo, self).__init__(src, width, height, **kwargs)


def display_videos(video_ids, W=400, H=300, fs=1):
    tab_contents = []
    for i, video_id in enumerate(video_ids):
        out = widgets.Output()
        with out:
            if video_ids[i][0] == "Youtube":
                video = YouTubeVideo(
                    id=video_ids[i][1], width=W, height=H, fs=fs, rel=0
                )
                print(f"Video available at https://youtube.com/watch?v={video.id}")
            else:
                video = PlayVideo(
                    id=video_ids[i][1],
                    source=video_ids[i][0],
                    width=W,
                    height=H,
                    fs=fs,
                    autoplay=False,
                )
                if video_ids[i][0] == "Bilibili":
                    print(
                        f"Video available at https://www.bilibili.com/video/{video.id}"
                    )
                elif video_ids[i][0] == "Osf":
                    print(f"Video available at https://osf.io/{video.id}")
            display(video)
        tab_contents.append(out)
    return tab_contents


video_ids = [('Youtube', 'FzXJ00pg34g'), ('Bilibili', 'BV1J14y197TT')]
tab_contents = display_videos(video_ids, W=730, H=410)
tabs = widgets.Tab()
tabs.children = tab_contents
for i in range(len(tab_contents)):
    tabs.set_title(i, video_ids[i][0])
display(tabs)
# @title Tutorial slides
# @markdown These are the slides for the videos in all tutorials today
from IPython.display import IFrame
link_id = "u7x62"

Sea level, or sea surface height (SSH), describes the vertical position of the interface between the atmosphere and the ocean. It varies on numerous timescales, attributable to different physical factors: hourly tides, daily-to-monthly perturbations caused by currents and storms, and changes over decades to centuries due to the thermal expansion of seawater and the mass added to the ocean by melting glaciers and ice sheets. Read more: NOAA 2022 Sea Level Rise Technical Report.

In this project, you will work with sea level rise data from the ECCO model (recall the W1D2 Tutorial 4 outputs) and from tidal gauge datasets.

Project Template#

[Project template figure]

Note: The dashed boxes are socio-economic questions.

Data Exploration Notebook#

Project Setup#

# imports
import random
import numpy as np
import matplotlib.pyplot as plt
import xarray as xr
import os
import pooch
import tempfile
# helper functions

def pooch_load(filelocation=None, filename=None, processor=None):
    # shared data directory on the JupyterHub; this path is different for each project/tutorial day
    shared_location = "/home/jovyan/shared/Data/Projects/Sea_Level"
    user_temp_cache = tempfile.gettempdir()

    if os.path.exists(os.path.join(shared_location, filename)):
        # use the copy already available in the shared directory
        file = os.path.join(shared_location, filename)
    else:
        # otherwise download the file with pooch into a temporary cache
        file = pooch.retrieve(
            filelocation,
            known_hash=None,
            fname=os.path.join(user_temp_cache, filename),
            processor=processor,
        )

    return file
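
For example, if you are working outside the shared JupyterHub environment, you could pass a download URL so that the helper falls back to pooch. The URL below is only a placeholder, not a real link; the call is shown purely to illustrate the helper's arguments.

# hypothetical usage of the helper above; replace the placeholder URL with a real download link
example_file = pooch_load(
    filelocation="https://example.com/placeholder.nc",  # placeholder URL, not a real download link
    filename="SEA_SURFACE_HEIGHT_mon_mean_1992-01-2017-12_ECCO_V4r4_latlon_0p50deg.nc",
)
# example_file can then be opened with xr.open_dataset(example_file)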

ECCO Sea Surface Height (SSH)#

In this project, you will analyse sea surface height (SSH) data from the ECCO reanalysis product, which combines simulations and observations. ECCO stands for Estimating the Circulation and Climate of the Ocean; it integrates observations with coupled ocean/sea-ice models. The NetCDF data file contains SSH stored as monthly means from 1992 to 2017 on a 0.5 x 0.5 degree grid. Using the ECCO product, you can estimate global and regional sea level changes caused by physical effects such as the thermal expansion of seawater. Further details about the dataset can be obtained here.

The sea surface height variable is called ‘SSH’ in the data. It is a gridded variable with three dimensions: time, latitude, and longitude. The code below shows how to load the SSH dataset and provides plotting examples: one plots the time series at a particular latitude and longitude, while another plots a colormap on the global grid. These examples should equip you to tackle many of the questions in the project template, so go ahead and explore!


# load the monthly-mean ECCO SSH data from the shared project directory
url_ECCO = "~/shared/Data/Projects/Sea_Level/SEA_SURFACE_HEIGHT_mon_mean_1992-01-2017-12_ECCO_V4r4_latlon_0p50deg.nc"
ds = xr.open_dataset(url_ECCO)
ds
ds["SSH"][:, 200, 134].plot()
ds["SSH"][100, :, :].plot.pcolormesh()
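
If you would rather select by coordinate values than by integer indices, xarray's .sel with method="nearest" is convenient. Below is a minimal sketch, assuming the coordinates in this file are named latitude and longitude (print ds to confirm); the location values are arbitrary examples, and the second snippet shows one way to form an area-weighted global mean.

# select the grid point nearest to a chosen location and plot its SSH time series
# note: coordinate names 'latitude'/'longitude' are an assumption; inspect ds to confirm them
ssh_point = ds["SSH"].sel(latitude=53.9, longitude=8.7, method="nearest")
ssh_point.plot()
plt.show()

# area-weighted global-mean SSH time series (weights proportional to cos(latitude))
weights = np.cos(np.deg2rad(ds["latitude"]))
ssh_global_mean = ds["SSH"].weighted(weights).mean(dim=["latitude", "longitude"])
ssh_global_mean.plot()
plt.show()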

Observations Dataset: Tidal Gauges#

Students can download tidal gauge data for any location of their choice from the University of Hawaii Sea Level Center (UHSLC) website: https://uhslc.soest.hawaii.edu/data/

It is recommended to download the NetCDF ‘daily’ data for a particular location, which can then be compared to the nearest latitude-longitude point in the ECCO dataset. To download tidal gauge data, select a location, right-click on the NetCDF file you want, copy the link address, and paste it as the URL below.

The file stores sea level as a variable called ‘sea_level’, which is a function of time. It can be fun to explore how well the tidal gauge data agree (or disagree) with the ECCO data (a comparison sketch follows the data preview below)!

# students can download any tidal gauge data of their choice from this website:
# https://uhslc.soest.hawaii.edu/data/
# instructions: select a location, right-click on the NetCDF file of the data you want,
# copy the link address, and paste it as the URL below
url_chosen = "https://uhslc.soest.hawaii.edu/data/netcdf/fast/daily/d825.nc"  # this is the link for "Cuxhaven, Germany"; change it to your location

# example code after downloading tidal gauge data:
ds = xr.open_dataset(
    pooch.retrieve(url_chosen, known_hash=None)
)  # this is just an example; the tidal gauge NetCDF file needs to be downloaded in order to load it
ds
<xarray.Dataset>
Dimensions:               (record_id: 1, time: 38504)
Coordinates:
  * time                  (time) datetime64[ns] 1917-12-30T12:00:00 ... 2023-...
  * record_id             (record_id) int16 8250
Data variables:
    sea_level             (record_id, time) float32 ...
    lat                   (record_id) float32 ...
    lon                   (record_id) float32 ...
    station_name          (record_id) |S8 ...
    station_country       (record_id) |S7 ...
    station_country_code  (record_id) float32 ...
    uhslc_id              (record_id) int16 ...
    gloss_id              (record_id) float32 ...
    ssc_id                (record_id) |S4 ...
    last_rq_date          (record_id) datetime64[ns] ...
Attributes:
    title:                  UHSLC Fast Delivery Tide Gauge Data (daily)
    ncei_template_version:  NCEI_NetCDF_TimeSeries_Orthogonal_Template_v2.0
    featureType:            timeSeries
    Conventions:            CF-1.6, ACDD-1.3
    date_created:           2023-07-16T14:26:10Z
    publisher_name:         University of Hawaii Sea Level Center (UHSLC)
    publisher_email:        philiprt@hawaii.edu, markm@soest.hawaii.edu
    publisher_url:          http://uhslc.soest.hawaii.edu
    summary:                The UHSLC assembles and distributes the Fast Deli...
    processing_level:       Fast Delivery (FD) data undergo a level 1 quality...
    acknowledgment:         The UHSLC Fast Delivery database is supported by ...
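
As a starting point for comparing the two datasets, here is a minimal sketch. It assumes the UHSLC daily ‘sea_level’ values are in millimetres and that the ECCO coordinates are named latitude and longitude (check both datasets to confirm), and it reloads the files into ds_gauge and ds_ecco so that the name ds is not reused for both. Keep in mind that a tide gauge measures sea level relative to the local land, so offsets and differences from the ECCO SSH field are expected.

# minimal comparison sketch; assumptions: UHSLC 'sea_level' is in millimetres,
# and the ECCO coordinates are named 'latitude'/'longitude' (inspect both datasets to confirm)
ds_gauge = xr.open_dataset(pooch.retrieve(url_chosen, known_hash=None))
ds_ecco = xr.open_dataset(url_ECCO)

# tide gauge location and record (drop the singleton record_id dimension)
gauge_lat = float(ds_gauge["lat"].squeeze())
gauge_lon = float(ds_gauge["lon"].squeeze())
gauge_m = ds_gauge["sea_level"].squeeze() / 1000.0  # mm -> m (assumed units)

# monthly means of the daily gauge data, expressed as anomalies about the record mean
gauge_monthly = gauge_m.resample(time="1MS").mean()
gauge_anom = gauge_monthly - gauge_monthly.mean()

# nearest ECCO grid point, also as anomalies
# note: you may need to reconcile longitude conventions (0-360 vs -180 to 180) between the datasets
ssh_near = ds_ecco["SSH"].sel(latitude=gauge_lat, longitude=gauge_lon, method="nearest")
ssh_anom = ssh_near - ssh_near.mean()

# plot both anomaly time series together
fig, ax = plt.subplots()
gauge_anom.plot(ax=ax, label="tide gauge (monthly mean)")
ssh_anom.plot(ax=ax, label="ECCO SSH")
ax.set_ylabel("sea level anomaly (m)")
ax.legend()
plt.show()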

Further Reading#

  • 2022 Sea Level Rise Technical Report https://oceanservice.noaa.gov/hazards/sealevelrise/sealevelrise-tech-report-sections.html

  • Oppenheimer, M., B.C. Glavovic , J. Hinkel, R. van de Wal, A.K. Magnan, A. Abd-Elgawad, R. Cai, M. Cifuentes-Jara, R.M. DeConto, T. Ghosh, J. Hay, F. Isla, B. Marzeion, B. Meyssignac, and Z. Sebesvari, 2019: Sea Level Rise and Implications for Low-Lying Islands, Coasts and Communities. In: IPCC Special Report on the Ocean and Cryosphere in a Changing Climate [H.-O. Pörtner, D.C. Roberts, V. Masson-Delmotte, P. Zhai, M. Tignor, E. Poloczanska, K. Mintenbeck, A. Alegría, M. Nicolai, A. Okem, J. Petzold, B. Rama, N.M. Weyer (eds.)]. Cambridge University Press, Cambridge, UK and New York, NY, USA, pp. 321-445. https://doi.org/10.1017/9781009157964.006.

  • Domingues, R., Goni, G., Baringer, M., & Volkov, D. (2018). What caused the accelerated sea level changes along the U.S. East Coast during 2010–2015? Geophysical Research Letters, 45, 13,367–13,376. https://doi.org/10.1029/2018GL081183

  • Church, J.A., P.U. Clark, A. Cazenave, J.M. Gregory, S. Jevrejeva, A. Levermann, M.A. Merrifield, G.A. Milne, R.S. Nerem, P.D. Nunn, A.J. Payne, W.T. Pfeffer, D. Stammer and A.S. Unnikrishnan, 2013: Sea Level Change. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change [Stocker, T.F., D. Qin, G.-K. Plattner, M. Tignor, S.K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex and P.M. Midgley (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA. https://www.ipcc.ch/site/assets/uploads/2018/02/WG1AR5_Chapter13_FINAL.pdf

  • Gregory, J.M., Griffies, S.M., Hughes, C.W. et al. Concepts and Terminology for Sea Level: Mean, Variability and Change, Both Local and Global. Surv Geophys 40, 1251–1289 (2019). https://doi.org/10.1007/s10712-019-09525-z

  • Wang, J., Church, J. A., Zhang, X., Gregory, J. M., Zanna, L., & Chen, X. (2021). Evaluation of the Local Sea‐Level Budget at Tide Gauges Since 1958. Geophysical Research Letters, 48(20), e2021GL094502. https://doi.org/10.1029/2021GL094502

  • Cazenave, A. and Cozannet, G.L. (2014), Sea level rise and its coastal impacts. Earth’s Future, 2: 15-34. https://doi.org/10.1002/2013EF000188

  • Mimura, N. Sea-level rise caused by climate change and its implications for society. Proc Jpn Acad Ser B Phys Biol Sci. 2013;89(7):281-301. doi: 10.2183/pjab.89.281. PMID: 23883609; PMCID: PMC3758961.

Resources#

This tutorial uses data from the simulations conducted as part of the CMIP6 multi-model ensemble.

For examples on how to access and analyze data, please visit the Pangeo Cloud CMIP6 Gallery

For more information on what CMIP is and how to access the data, please see this page.