argopy.stores.httpstore.open_mfdataset#
- httpstore.open_mfdataset(urls, max_workers: int = 112, method: str = 'thread', progress: bool | str = False, concat: bool = True, concat_dim='row', preprocess=None, preprocess_opts={}, errors: str = 'ignore', *args, **kwargs)[source]#
Open multiple urls as a single xarray dataset.
This is a version of the argopy.stores.httpstore.open_dataset method that can handle a list of urls/paths, sequentially or in parallel. A thread pool is used by default for parallelization.
- Parameters:
max_workers (int, default: 112) – Maximum number of threads or processes
method (str, default: 'thread') – The parallelization method to execute calls asynchronously:
  - 'thread' (default): use a pool of at most max_workers threads
  - 'process': use a pool of at most max_workers processes
  - a distributed.client.Client: experimental, expect this method to fail!
  - 'seq': open data sequentially, no parallelization applied
progress (bool, default: False) – Display a progress bar
concat (bool, default: True) – Concatenate results in a single xarray.Dataset, or not (in this case the function will return a list of xarray.Dataset)
concat_dim (str, default: 'row') – Name of the dimension to use to concatenate all datasets (passed to xarray.concat)
preprocess (callable, optional) – If provided, call this function on each dataset prior to concatenation
preprocess_opts (dict, optional) – If preprocess is provided, pass this as its options
errors (str, default: 'ignore') – Define how to handle errors raised during data URIs fetching:
  - 'raise': raise any error encountered
  - 'ignore' (default): do not stop processing, simply issue a debug message in the logging console
  - 'silent': do not stop processing and do not issue a log message
args, kwargs – Other arguments passed to argopy.stores.httpstore.open_dataset
- Returns:
output
- Return type:
xarray.Dataset or list of xarray.Dataset
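The parameters above describe a fetch, preprocess, and concatenate pipeline run over a thread pool. A minimal sketch of that orchestration using only the standard library (the helper names and fake "datasets" here are illustrative, not argopy internals):

```python
from concurrent.futures import ThreadPoolExecutor


def open_one(url):
    # Stand-in for httpstore.open_dataset(url); returns a fake "dataset".
    return {"url": url, "rows": [1, 2, 3]}


def open_safely(url, preprocess, errors):
    # Fetch one URL, apply the optional preprocess step, and honor the
    # 'raise' / 'ignore' / 'silent' error policies for this URL only.
    try:
        ds = open_one(url)
        return preprocess(ds) if preprocess else ds
    except Exception:
        if errors == "raise":
            raise
        return None  # 'ignore' / 'silent': drop this URL, keep going


def open_many(urls, max_workers=4, preprocess=None, errors="ignore", concat=True):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = [
            ds
            for ds in pool.map(lambda u: open_safely(u, preprocess, errors), urls)
            if ds is not None
        ]
    if concat:
        # Stand-in for xarray.concat(results, dim='row').
        return {"rows": [r for ds in results for r in ds["rows"]]}
    return results


urls = [f"https://example.org/argo/{i}.nc" for i in range(3)]
merged = open_many(urls)
print(len(merged["rows"]))  # 9
```

With concat=False the sketch, like open_mfdataset, returns the per-URL datasets as a list instead of a single concatenated object.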