argopy.data_fetchers.gdac_data.Fetch_box#
- class Fetch_box(ds: str = '', cache: bool = False, cachedir: str = '', parallel: bool = False, progress: bool = False, dimension: Literal['point', 'profile'] = 'point', errors: str = 'raise', api_timeout: int = 0, **kwargs)[source]#
Manage access to GDAC Argo data for a rectangular space/time domain.
This class is instantiated when a call is made to these facade access points:
>>> ArgoDataFetcher(src='gdac').region(**)
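For example, a minimal sketch of a region fetch through this facade (the box bounds below are purely illustrative):
>>> import argopy
>>> # Illustrative bounds: lon_min, lon_max, lat_min, lat_max, pres_min, pres_max, date_min, date_max
>>> box = [-75, -45, 20, 30, 0, 100, '2011-01', '2011-06']
>>> ds = argopy.DataFetcher(src='gdac').region(box).to_xarray()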
- __init__(ds: str = '', cache: bool = False, cachedir: str = '', parallel: bool = False, progress: bool = False, dimension: Literal['point', 'profile'] = 'point', errors: str = 'raise', api_timeout: int = 0, **kwargs)#
Init fetcher
- Parameters:
ds (str, default = OPTIONS['ds']) – Dataset to load: ‘phy’ or ‘bgc’
cache (bool (optional)) – Cache data or not (default: False)
cachedir (str (optional)) – Path to cache folder
dimension (str, default: 'point') – Main dimension of the output dataset. This can be “profile” to retrieve a collection of profiles, or “point” (default) to have data as a collection of measurements. This can be used to optimise performance.
errors (str (optional)) – If set to ‘raise’ (default), will raise a NetCDF4FileNotFoundError if any of the requested files cannot be found. If set to ‘ignore’, files that cannot be found are skipped when fetching data.
parallel (bool, str, distributed.Client, default: False) – Set whether to use parallelization or not, and possibly which method to use.
- Possible values:
  - False: no parallelization is used
  - True: use the default method specified by the parallel_default_method option
  - any other values accepted by the parallel_default_method option
progress (bool (optional)) – Show a progress bar or not when fetching data.
api_timeout (int (optional)) – Server request time out in seconds. Set to OPTIONS[‘api_timeout’] by default.
gdac (str, default = OPTIONS['gdac']) – Path to the local or remote directory where the ‘dac’ folder is located
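The fetcher can also be instantiated directly. A minimal sketch, assuming the space/time box is forwarded through **kwargs to init() as box (all values below are illustrative):
>>> from argopy.data_fetchers.gdac_data import Fetch_box
>>> # Illustrative box and options; see the Parameters list above
>>> f = Fetch_box(box=[-75, -45, 20, 30, 0, 100, '2011-01', '2011-06'],
...               ds='phy', cache=False, parallel=False,
...               progress=True, dimension='point')
>>> ds = f.to_xarray()  # load the selection as a xarray.Dataset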
Methods
- __init__([ds, cache, cachedir, parallel, ...]): Init fetcher
- clear_cache(): Remove cached files and entries from resources opened with this fetcher
- cname(): Return a unique string defining the constraints
- dashboard(**kw): Return 3rd party dashboard for the access point
- filter_data_mode(ds, **kwargs): Apply xarray argo accessor filter_data_mode method
- filter_points(ds)
- filter_qc(ds, **kwargs)
- filter_researchmode(ds, *args, **kwargs): Filter dataset for research user mode
- filter_variables(ds, *args, **kwargs): Filter variables according to dataset and user mode
- init(box[, nrows]): Create Argo data loader
- pre_process(ds, *args, **kwargs)
- to_xarray([errors, concat, concat_method, ...]): Load Argo data and return a xarray.Dataset
- transform_data_mode(ds, **kwargs): Apply xarray argo accessor transform_data_mode method
- uri_mono2multi(URIs): Convert mono-profile URI files to multi-profile files
Attributes
- cachepath: Return path to cache file(s) for this request
- data_source
- sha: Returns a unique SHA for a specific cname / fetcher implementation
- uri: List of files to load for a request
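Continuing the illustrative sketch above, the request can be inspected before any data are loaded:
>>> f.cname()  # unique string defining the space/time constraints
>>> f.sha      # unique SHA for this cname / fetcher implementation
>>> f.uri      # list of GDAC files to load for this request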