argopy.data_fetchers.erddap_data.ErddapArgoDataFetcher
- class ErddapArgoDataFetcher(ds: str = '', cache: bool = False, cachedir: str = '', parallel: bool = False, progress: bool = False, chunks: str = 'auto', chunks_maxsize: dict = {}, api_timeout: int = 0, params: str | list = 'all', measured: str | list | None = None, **kwargs)[source]
Manage access to Argo data through the Ifremer ERDDAP.
ERDDAP transactions are managed with the ioos/erddapy library.
This class is a prototype and is not meant to be instantiated directly.
- __init__(ds: str = '', cache: bool = False, cachedir: str = '', parallel: bool = False, progress: bool = False, chunks: str = 'auto', chunks_maxsize: dict = {}, api_timeout: int = 0, params: str | list = 'all', measured: str | list | None = None, **kwargs)[source]
Instantiate an ERDDAP Argo data fetcher
- Parameters:
ds (str, default = OPTIONS['ds']) – Dataset to load: 'phy', 'ref' or 'bgc-s'
cache (bool, optional) – Cache data or not (default: False)
cachedir (str, optional) – Path to the cache folder
parallel (bool, str or distributed.Client, default: False) – Set whether to use parallelization or not, and possibly which method to use. Possible values:
- False: no parallelization is used
- True: use the default method specified by the parallel_default_method option
- any other value accepted by the parallel_default_method option
progress (bool, optional) – Show a progress bar or not when parallel is set to True
chunks ('auto' or dict of integers, optional) – Dictionary with request access points as keys and number of chunks to create as values. E.g. {'wmo': 10} will create a maximum of 10 chunks along WMOs when used with Fetch_wmo.
chunks_maxsize (dict, optional) – Dictionary with request access points as keys and chunk sizes as values (used as maximum values in 'auto' chunking). E.g. {'wmo': 5} will create chunks with as many as 5 WMOs each.
api_timeout (int, optional) – ERDDAP request timeout in seconds. Set to OPTIONS['api_timeout'] by default.
params (str or list, optional, default='all') – List of BGC essential variables to retrieve, i.e. that will be in the output xr.DataSet. By default this is set to 'all', i.e. any variable found in at least one of the profiles in the data selection will be included in the output.
measured (str, list or None, optional, default=None) – List of BGC essential variables that cannot be NaN. If set to 'all', this is an easy way to reduce the size of the xr.DataSet to points where all variables have been measured. Otherwise, provide a simple list of variables.
server (str, default = OPTIONS['erddap']) – URL of the ERDDAP server
mode (str, default = OPTIONS['mode'])
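To make the interaction between the chunks and chunks_maxsize options concrete, here is a minimal, self-contained sketch of how a list of WMO numbers could be split into request chunks. This is an illustration of the documented behaviour only, not argopy's actual implementation; the helper name chunk_wmos and the fallback chunk size of 5 are assumptions.

```python
from typing import List, Union


def chunk_wmos(wmos: List[int],
               chunks: Union[str, dict] = "auto",
               chunks_maxsize: dict = None) -> List[List[int]]:
    """Split a list of WMO numbers into chunks, mimicking the documented
    behaviour of the ``chunks`` / ``chunks_maxsize`` options."""
    chunks_maxsize = chunks_maxsize or {}
    if chunks == "auto":
        # 'auto' chunking: cap each chunk at chunks_maxsize['wmo'] items
        size = chunks_maxsize.get("wmo", 5)
    else:
        # Explicit chunking: create at most chunks['wmo'] chunks
        n = chunks.get("wmo", 1)
        size = -(-len(wmos) // n)  # ceiling division
    return [wmos[i:i + size] for i in range(0, len(wmos), size)]


# {'wmo': 3} creates a maximum of 3 chunks along WMOs
groups = chunk_wmos(list(range(6902000, 6902012)), chunks={"wmo": 3})
```

With 12 WMOs and chunks={'wmo': 3}, this yields 3 chunks of 4 WMOs each; with chunks='auto' and chunks_maxsize={'wmo': 5}, it yields chunks of at most 5 WMOs.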
Methods
__init__([ds, cache, cachedir, parallel, ...]) – Instantiate an ERDDAP Argo data fetcher
clear_cache() – Remove cache files and entries from resources opened with this fetcher
cname() – Return a unique string defining the constraints
dashboard(**kw) – Return 3rd party dashboard for the access point
define_constraints() – Define erddapy constraints
filter_data_mode(ds, **kwargs) – Apply xarray argo accessor filter_data_mode method
filter_measured(ds) – Re-enforce the 'measured' criteria for BGC requests
filter_qc(ds, **kwargs) – Apply xarray argo accessor filter_qc method
filter_researchmode(ds, *args, **kwargs) – Filter dataset for research user mode
filter_variables(ds, *args, **kwargs) – Filter variables according to dataset and user mode
get_url() – Return the URL to download requested data
init(*args, **kwargs) – Initialisation for a specific fetcher
pre_process(this_ds, *args, **kwargs)
to_xarray([errors, add_dm, concat, max_workers]) – Load Argo data and return a xarray.DataSet
transform_data_mode(ds, **kwargs) – Apply xarray argo accessor transform_data_mode method
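Since define_constraints and get_url together produce the ERDDAP request, the following sketch shows how a tabledap download URL is typically assembled from a variable list and a constraints dictionary. This is a simplified illustration under assumed names (build_erddap_url, the ArgoFloats dataset id); in argopy this work is delegated to the ioos/erddapy library.

```python
from urllib.parse import quote


def build_erddap_url(server: str, dataset_id: str,
                     variables: list, constraints: dict) -> str:
    """Assemble an ERDDAP tabledap CSV request URL from a list of variables
    and a dict of constraints such as {'time>=': '2020-01-01'}."""
    # The query starts with the comma-separated variables to return
    query = ",".join(variables)
    # Each constraint is appended as '&<name><operator><value>'
    for name, value in constraints.items():
        query += "&" + name + str(value)
    # Percent-encode everything except the ERDDAP query syntax characters
    return f"{server}/tabledap/{dataset_id}.csv?" + quote(query, safe="=&,><")


url = build_erddap_url(
    "https://erddap.ifremer.fr/erddap",
    "ArgoFloats",
    ["platform_number", "time", "temp"],
    {"platform_number=": '"6902746"'},
)
```

Here the double quotes around the WMO value are percent-encoded, while the `=`, `&` and comparison operators that ERDDAP's query grammar relies on are left intact.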
Attributes
N_POINTS – Number of measurements expected to be returned by a request
cachepath – Return path to cached file(s) for this request
data_source
server – URL of the Erddap server
sha – Returns a unique SHA for a specific cname / fetcher implementation
uri – Return the list of Unique Resource Identifier (URI) to download data
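The sha attribute pairs naturally with cname: the constraint string identifies the request, and hashing it yields a compact token usable as a cache key. The sketch below shows one way such a token could be derived; the function name fetcher_sha and the exact recipe (SHA-256 over a version-prefixed cname) are assumptions, not argopy's actual algorithm.

```python
import hashlib


def fetcher_sha(cname: str, fetcher_version: str = "erddap-1") -> str:
    """Derive a unique, deterministic token for a request by hashing the
    constraints string (cname) together with a fetcher identifier."""
    token = f"{fetcher_version};{cname}"
    return hashlib.sha256(token.encode("utf-8")).hexdigest()


key = fetcher_sha("phy;WMO6902746")
```

Because the hash is deterministic, two fetchers built with identical constraints map to the same cache entry, while any change to the constraints or fetcher implementation produces a different key.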