argopy.data_fetchers.gdac_data.GDACArgoDataFetcher
- class GDACArgoDataFetcher(gdac: str = '', ds: str = '', cache: bool = False, cachedir: str = '', dimension: str = 'point', errors: str = 'raise', parallel: bool = False, progress: bool = False, api_timeout: int = 0, **kwargs)
Manage access to Argo data from a GDAC server
Warning
This class is a prototype and is not meant to be instantiated directly.
- __init__(gdac: str = '', ds: str = '', cache: bool = False, cachedir: str = '', dimension: str = 'point', errors: str = 'raise', parallel: bool = False, progress: bool = False, api_timeout: int = 0, **kwargs)
Init fetcher
- Parameters:
gdac (str (optional)) – Path to the local or remote directory where the ‘dac’ folder is located
ds (str (optional)) – Dataset to load: ‘phy’ or ‘bgc’
cache (bool (optional)) – Cache data or not (default: False)
cachedir (str (optional)) – Path to cache folder
dimension (str, default: 'point') – Main dimension of the output dataset. This can be “profile” to retrieve a collection of profiles, or “point” (default) to have data as a collection of measurements. This can be used to optimise performance.
errors (str (optional)) – If set to ‘raise’ (default), a NetCDF4FileNotFoundError is raised if any of the requested files cannot be found. If set to ‘ignore’, files that cannot be found are skipped when fetching data.
parallel (bool, str, distributed.Client, default: False) – Set whether to use parallelization or not, and possibly which method to use. Possible values:
- False: no parallelization is used
- True: use the default method specified by the parallel_default_method option
- any other value accepted by the parallel_default_method option
progress (bool (optional)) – Show a progress bar or not when fetching data.
api_timeout (int (optional)) – Server request timeout in seconds. Set to OPTIONS[‘api_timeout’] by default.
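As a minimal usage sketch: since this class is not meant to be instantiated directly, the options above are normally passed through the public argopy.DataFetcher facade with src='gdac'. The keyword names below mirror the parameters documented here, but exact option names may vary between argopy versions.

import argopy

# Hedged sketch: reach the GDAC fetcher through the public facade.
fetcher = argopy.DataFetcher(
    src="gdac",      # select the GDAC data source (handled by this class)
    ds="phy",        # core physical dataset ('phy') or 'bgc'
    cache=True,      # cache downloaded files locally
    parallel=True,   # use the default parallelization method
    progress=True,   # show a progress bar while fetching
)

# Define an access point and load the data as a collection of points:
ds = fetcher.float(6902746).to_xarray()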
Methods
__init__([gdac, ds, cache, cachedir, ...]) – Init fetcher
clear_cache() – Remove cached files and entries from resources opened with this fetcher
cname() – Return a unique string defining the constraints
dashboard(**kw) – Return 3rd party dashboard for the access point
filter_data_mode(ds, **kwargs) – Apply xarray argo accessor filter_data_mode method
filter_points(ds)
filter_qc(ds, **kwargs)
filter_researchmode(ds, *args, **kwargs) – Filter dataset for research user mode
filter_variables(ds, *args, **kwargs) – Filter variables according to dataset and user mode
init(*args, **kwargs) – Initialisation for a specific fetcher
pre_process(ds, *args, **kwargs)
to_xarray([errors]) – Load Argo data and return a xarray.Dataset
transform_data_mode(ds, **kwargs) – Apply xarray argo accessor transform_data_mode method
uri_mono2multi(URIs) – Convert mono-profile URI files to multi-profile files
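The filter_* and transform_* methods above are post-processing steps applied to the fetched xarray.Dataset. As an illustrative sketch (not a specification of the internal call order), the user mode selected on the public facade determines which of these filters end up being applied:

import argopy

# Hedged sketch: the variable set and QC filtering of the returned dataset
# depend on the user mode requested on the facade; which filter_* methods
# run internally is an implementation detail of the fetcher.
for mode in ("expert", "standard", "research"):
    f = argopy.DataFetcher(src="gdac", mode=mode).float(6902746)
    ds = f.to_xarray()
    print(mode, len(ds.data_vars), "variables")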
Attributes
cachepath – Return path to cache file(s) for this request
data_source
sha – Returns a unique SHA for a specific cname / fetcher implementation
uri – Return the list of files to load
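A hedged sketch of inspecting these attributes from user code: fetcher.uri on the public facade is assumed to forward to this fetcher's uri, and the .fetcher attribute used below to reach the underlying GDACArgoDataFetcher instance is an assumption about the facade internals.

import argopy

facade = argopy.DataFetcher(src="gdac", cache=True).profile(6902746, 12)

print(facade.uri)              # list of GDAC netCDF files to load for this request

gdac_fetcher = facade.fetcher  # underlying GDACArgoDataFetcher instance (assumed attribute name)
print(gdac_fetcher.sha)        # unique SHA for this cname / fetcher implementation
print(gdac_fetcher.cachepath)  # path(s) to cached files, once data have been fetched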