argopy.stores.filestore.open_mfdataset
- filestore.open_mfdataset(urls, concat_dim='row', max_workers: int = 6, method: str = 'thread', progress: bool = False, concat: bool = True, preprocess=None, preprocess_opts={}, open_dataset_opts={}, errors: str = 'ignore', *args, **kwargs)
Open multiple urls as a single xarray dataset.
This is a version of the open_dataset method that can handle a list of urls/paths, sequentially or in parallel. A thread pool is used by default for parallelization.
- Parameters:
concat_dim (str) – Name of the dimension to use to concatenate all datasets (passed to xarray.concat)
max_workers (int) – Maximum number of threads or processes
method (str) – The parallelization method to execute calls asynchronously:
- 'thread' (default): use a pool of at most max_workers threads
- 'process': use a pool of at most max_workers processes
- (XFAIL) a distributed.client.Client object
Use 'seq' to simply open data sequentially.
progress (bool) – Display a progress bar (False by default)
preprocess (callable, optional) – If provided, call this function on each dataset prior to concatenation
errors (str) – Should it 'raise' or 'ignore' errors. Default: 'ignore'
- Return type:
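A minimal usage sketch is given below. The file paths and the keep_pressure_only helper are hypothetical illustrations; only the filestore class and the parameters documented above come from this page.

```python
from argopy.stores import filestore

# Hypothetical list of Argo netCDF files; any paths or urls readable by the
# store's open_dataset method would do.
urls = ["dac/coriolis/6902746_prof.nc", "dac/coriolis/6902747_prof.nc"]

def keep_pressure_only(ds):
    # Hypothetical preprocessing step, applied to each dataset before concatenation
    return ds[["PRES"]]

fs = filestore()
ds = fs.open_mfdataset(
    urls,
    concat_dim="row",              # dimension name passed to xarray.concat
    method="thread",               # "process" for a process pool, "seq" for sequential opening
    max_workers=6,                 # maximum number of threads or processes
    progress=True,                 # display a progress bar
    preprocess=keep_pressure_only,
    errors="ignore",               # skip urls that fail to open instead of raising
)
```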