argopy.stores.filestore.open_mfdataset

filestore.open_mfdataset(urls, concat_dim='row', max_workers: int = 6, method: str = 'thread', progress: bool = False, concat: bool = True, preprocess=None, preprocess_opts={}, open_dataset_opts={}, errors: str = 'ignore', *args, **kwargs)

Open multiple urls as a single xarray dataset.

This is a version of the open_dataset method that can handle a list of urls/paths, opened sequentially or in parallel.

Parallelization uses a thread pool by default.

Parameters:
  • urls (list(str)) – List of urls/paths to open

  • concat_dim (str) – Name of the dimension to use to concatenate all datasets (passed to xarray.concat)

  • max_workers (int) – Maximum number of threads or processes

  • method (str) –

    The parallelization method to execute calls asynchronously:

    • 'thread' (default): use a pool of at most max_workers threads

    • 'process': use a pool of at most max_workers processes

    Use 'seq' to simply open data sequentially.

  • progress (bool) – Display a progress bar (False by default)

  • preprocess (callable, optional) – If provided, call this function on each dataset prior to concatenation

  • errors (str) – Whether to 'raise' or 'ignore' errors. Default: 'ignore'

Return type:

xarray.Dataset
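
A minimal usage sketch follows; the file paths are placeholders and the preprocess function is only illustrative (it assumes each file contains a DIRECTION variable):

>>> from argopy.stores import filestore
>>> fs = filestore()
>>> urls = ["/data/argo/profile_001.nc", "/data/argo/profile_002.nc"]  # placeholder paths
>>> def keep_ascending(ds):
...     # Illustrative preprocess: keep only ascending profiles before concatenation
...     return ds.where(ds["DIRECTION"] == "A", drop=True)
>>> ds = fs.open_mfdataset(
...     urls,
...     concat_dim="row",           # dimension passed to xarray.concat
...     method="thread",            # thread pool; use "seq" to open files one by one
...     max_workers=6,
...     progress=True,              # display a progress bar
...     preprocess=keep_ascending,
...     errors="ignore",            # skip files that fail to open instead of raising
... )

With errors='ignore' (the default), files that fail to open are skipped rather than aborting the whole merge.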