argopy.stores.httpstore.open_dataset

httpstore.open_dataset(url: str, errors: Literal['raise', 'ignore', 'silent'] = 'raise', lazy: bool = False, dwn_opts: dict = {}, xr_opts: dict = {}, **kwargs) -> Dataset

Create a xarray.Dataset from a URL pointing to a netCDF file

Parameters:
  • url (str) – The remote URL of the netCDF file to open

  • errors (Literal, default: raise) –

    Define how to handle errors raised during data fetching:
    • raise (default): Raise any error encountered

    • ignore: Do not stop processing, simply issue a debug message in the logging console

    • silent: Do not stop processing and do not issue log message

  • lazy (bool, default=False) –

    Define whether the netCDF file should be loaded lazily or not

    If this is set to False (default), opening is done in 2 steps:
    1. Download raw binary data from url with httpstore.download_url(),

    2. Create a xarray.Dataset with xarray.open_dataset().

    Each function can be passed specific arguments with dwn_opts and xr_opts (see below).

    If this is set to True, use an ArgoKerchunker instance to access the netCDF file lazily. You can provide a specific ArgoKerchunker instance with the ak argument (see below). Both opening modes are illustrated in the Examples section below.

  • dwn_opts (dict, default={}) – Options passed to httpstore.download_url()

  • xr_opts (dict, default={}) – Options passed to xarray.open_dataset()

  • ak (ArgoKerchunker, optional) – ArgoKerchunker instance to use if lazy=True.

  • akoverwrite (bool, optional) – Determine whether the kerchunk data should be overwritten or not. This is passed to ArgoKerchunker.to_kerchunk().

Return type:

xarray.Dataset

Raises:
  • TypeError – Raised if data returned by url are not CDF or HDF5 binary data.

  • DataNotFound – Raised if errors is set to raise and url returns no data.
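
Examples:

The snippets below are a minimal usage sketch rather than verbatim argopy documentation: the file URL is illustrative, and the default constructor arguments are assumptions.

With the default lazy=False, the file is first downloaded with httpstore.download_url() (tunable through dwn_opts), then decoded with xarray.open_dataset() (tunable through xr_opts):

>>> from argopy.stores import httpstore
>>> fs = httpstore()
>>> # Illustrative Argo GDAC file URL, not a guaranteed valid path:
>>> uri = "https://data-argo.ifremer.fr/dac/coriolis/6902746/6902746_prof.nc"
>>> # Forward a standard xarray.open_dataset option through xr_opts:
>>> ds = fs.open_dataset(uri, xr_opts={"decode_times": False})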
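
With lazy=True, the netCDF file is accessed lazily through an ArgoKerchunker instance instead of being downloaded in full; a specific instance can be passed with the ak argument (import path and default constructor assumed here):

>>> from argopy.stores import ArgoKerchunker  # assumed import path
>>> ds_lazy = fs.open_dataset(uri, lazy=True, ak=ArgoKerchunker())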