What’s New#


v0.1.12 (16 May 2022)#

Internals

v0.1.11 (13 Apr. 2022)#

Features and front-end API

  • New data source gdac to retrieve data from a GDAC-compliant source, for both DataFetcher and IndexFetcher. You can specify the FTP source with the ftp fetcher option or with the argopy global option ftp. The FTP source supports http, ftp or local file protocols. This fetcher is optimised if pyarrow is available, otherwise pandas DataFrames are used. See update on Data sources. (#157) by G. Maze

import argopy
from argopy import IndexFetcher
from argopy import DataFetcher
argo = IndexFetcher(src='gdac')
argo = DataFetcher(src='gdac')
argo = DataFetcher(src='gdac', ftp="https://data-argo.ifremer.fr")  # Default and fastest !
argo = DataFetcher(src='gdac', ftp="ftp://ftp.ifremer.fr/ifremer/argo")
with argopy.set_options(src='gdac', ftp='ftp://usgodae.org/pub/outgoing/argo'):
    argo = DataFetcher()

Note

The new gdac fetcher uses the Argo index to determine which profile files to load. Hence, this fetcher may show poor performance when used with a region access point. Check the Performances documentation page for ways to speed things up; otherwise, we recommend using a webAPI access point (erddap or argovis).

Warning

Since the new gdac fetcher can use a local copy of the GDAC ftp server, the legacy localftp fetcher is now deprecated. Using it will raise a warning up to v0.1.12. It will then raise an error in v0.1.13 and will be removed afterward.
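
As a minimal migration sketch (the local directory below is hypothetical), the gdac fetcher can point the ftp option to a local GDAC copy instead of using the deprecated localftp fetcher:

import argopy
from argopy import DataFetcher
# Hypothetical path to a local GDAC mirror containing the 'dac' folder and index files
with argopy.set_options(src='gdac', ftp='/data/Argo/gdac_mirror'):
    argo = DataFetcher().float(6902746)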

  • New dashboard for profiles and new 3rd party dashboards. Calling the data fetcher dashboard method will return the Euro-Argo profile page for a single profile. Very useful to look at the data before loading. This comes with two new utility functions to get the Coriolis ID of profiles (utilities.get_coriolis_profile_id()) and to return the list of profile webpages (utilities.get_ea_profile_page()). (#198) by G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
ArgoDataFetcher().profile(5904797, 11).dashboard()
from argopy.utilities import get_coriolis_profile_id, get_ea_profile_page
get_coriolis_profile_id([6902755, 6902756], [11, 12])
get_ea_profile_page([6902755, 6902756], [11, 12])

The new profile dashboard can also be accessed with:

import argopy
argopy.dashboard(5904797, 11)

We added the Ocean-OPS (former JCOMMOPS) dashboard for all floats and the Argo-BGC dashboard for BGC floats:

import argopy
argopy.dashboard(5904797, type='ocean-ops')
# or
argopy.dashboard(5904797, 12, type='bgc')
  • New utility class argopy.utilities.ArgoNVSReferenceTables to retrieve Argo Reference Tables. (@cc8fdbe) by G. Maze.

from argopy.utilities import ArgoNVSReferenceTables
R = ArgoNVSReferenceTables()
R.all_tbl_name()
R.tbl(3)
R.tbl('R09')

Internals

  • gdac and localftp data fetchers can return an index without loading the data. (#157) by G. Maze

from argopy import DataFetcher
argo = DataFetcher(src='gdac').float(6903076)
argo.index
  • New index store design. A new index store is used by the gdac data and index fetchers to handle access and search in the Argo index csv files. It uses a pyarrow table if available, or a pandas dataframe otherwise. More details at Argo index store. Directly using this index store is not recommended, but it provides better performance for expert users interested in Argo sampling analysis.

from argopy.stores.argo_index_pa import indexstore_pyarrow as indexstore
idx = indexstore(host="https://data-argo.ifremer.fr", index_file="ar_index_global_prof.txt")  # Default
idx.load()
idx.search_lat_lon_tim([-60, -55, 40., 45., '2007-08-01', '2007-09-01'])
idx.N_MATCH  # Return number of search results
idx.to_dataframe()  # Convert search results to a dataframe
  • Refactoring of CI tests to use more fixtures and pytest parametrize. (#157) by G. Maze

  • Fix bug in the erddap data fetcher that was causing a profile request not to account for cycle numbers. (@301e557) by G. Maze.
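
For reference, a minimal sketch of the kind of request affected (float and cycle numbers are illustrative):

from argopy import DataFetcher as ArgoDataFetcher
# With the fix, only the requested cycle is returned, not the whole float time series
ds = ArgoDataFetcher(src='erddap').profile(6902746, 34).to_xarray()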

Breaking changes

  • The index fetcher for local FTP no longer supports the option index_file. The name of the index file is internally determined from the requested dataset: ar_index_global_prof.txt for ds='phy' and argo_synthetic-profile_index.txt for ds='bgc'. Using this option will raise a deprecation warning up to v0.1.12 and will then raise an error. (#157) by G. Maze

  • Complete refactoring of the argopy.plotters module into argopy.plot. (#198) by G. Maze.

  • Removed deprecation warnings for plotters.plot_dac and plotters.plot_profilerType. These now raise an error.

v0.1.10 (4 Mar. 2022)#

Internals

  • Update and clean up requirements. Remove upper bound on all dependencies (#182) by R. Abernathey.

v0.1.9 (19 Jan. 2022)#

Features and front-end API

  • New method to preprocess data for the OWC software. This method can preprocess Argo data and possibly create float_source/<WMO>.mat files to be used as inputs for OWC implementations in Matlab and Python. See the Salinity calibration documentation page for more. (#142) by G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
ds = ArgoDataFetcher(mode='expert').float(6902766).load().data
ds.argo.create_float_source("float_source")
ds.argo.create_float_source("float_source", force='raw')
ds_source = ds.argo.create_float_source()

This new method comes with other methods and improvements:

  • New dataset properties accessible from the argo xarray accessor: N_POINTS, N_LEVELS, N_PROF. Note that depending on the format of the dataset (a collection of points or of profiles), these values do or do not take NaN into account. This information is also visible with a simple print of the accessor. (#142) by G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
ds = ArgoDataFetcher(mode='expert').float(6902766).load().data
ds.argo.N_POINTS
ds.argo.N_LEVELS
ds.argo.N_PROF
ds.argo
  • New plotter function argopy.plotters.open_sat_altim_report() to insert the CLS Satellite Altimeter Report figure in a notebook cell. (#159) by G. Maze.

from argopy.plotters import open_sat_altim_report
open_sat_altim_report(6902766)
open_sat_altim_report([6902766, 6902772, 6902914])
open_sat_altim_report([6902766, 6902772, 6902914], embed='dropdown')  # Default
open_sat_altim_report([6902766, 6902772, 6902914], embed='slide')
open_sat_altim_report([6902766, 6902772, 6902914], embed='list')
open_sat_altim_report([6902766, 6902772, 6902914], embed=None)

The satellite altimetry report can also be visualised directly from the data and index fetchers with the qc_altimetry plot:

from argopy import DataFetcher
from argopy import IndexFetcher
DataFetcher().float([6902745, 6902746]).plot('qc_altimetry')
IndexFetcher().float([6902745, 6902746]).plot('qc_altimetry')

  • New TopoFetcher utility class to fetch topography data for a given region (from the GEBCO grid):

from argopy import TopoFetcher
box = [-75, -45, 20, 30]
ds = TopoFetcher(box).to_xarray()
ds = TopoFetcher(box, ds='gebco', stride=[10, 10], cache=True).to_xarray()

For convenience, we also added a new property to the data fetcher that returns the domain covered by the dataset.

loader = ArgoDataFetcher().float(2901623)
loader.domain  # Returns [89.093, 96.036, -0.278, 4.16, 15.0, 2026.0, numpy.datetime64('2010-05-14T03:35:00.000000000'),  numpy.datetime64('2013-01-01T01:45:00.000000000')]

Internals

v0.1.8 (2 Nov. 2021)#

Features and front-end API

  • Improve plotting functions. All functions are now available for both the index and data fetchers. See the Data visualisation page for more details. Reduced plotting dependencies to Matplotlib only. Argopy will use Seaborn and/or Cartopy if available. (#56) by G. Maze.

from argopy import IndexFetcher as ArgoIndexFetcher
from argopy import DataFetcher as ArgoDataFetcher
obj = ArgoIndexFetcher().float([6902766, 6902772, 6902914, 6902746])
# OR
obj = ArgoDataFetcher().float([6902766, 6902772, 6902914, 6902746])

fig, ax = obj.plot()
fig, ax = obj.plot('trajectory')
fig, ax = obj.plot('trajectory', style='white', palette='Set1', figsize=(10,6))
fig, ax = obj.plot('dac')
fig, ax = obj.plot('institution')
fig, ax = obj.plot('profiler')
from argopy import DataFetcher as ArgoDataFetcher
loader = ArgoDataFetcher().float([6902766, 6902772, 6902914, 6902746])
loader.load()
loader.data
loader.index
loader.to_index()
from argopy import IndexFetcher as ArgoIndexFetcher
indexer = ArgoIndexFetcher().float([6902766, 6902772])
indexer.load()
indexer.index
  • Add optional speed of sound computation to xarray accessor teos10 method. (#90) by G. Maze.
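
For example, a minimal sketch (assuming the speed of sound is requested through the variable name SOUND_SPEED; check the accessor documentation for the exact name):

from argopy import DataFetcher as ArgoDataFetcher
ds = ArgoDataFetcher().float(6902746).to_xarray()
ds = ds.argo.teos10(['SOUND_SPEED'])  # requires the gsw package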

  • Code spell fixes (#89) by K. Schwehr.

Internals

  • Check the validity of access point options (WMO and box) in the facade; no checks are done at the fetcher level. (#92) by G. Maze.

  • More general options. Fix #91. (#102) by G. Maze.

    • trust_env to allow for local environment variables to be used by fsspec to connect to the internet. Useful for those using a proxy.
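
A minimal usage sketch of this option:

import argopy
# Let fsspec pick up local environment variables (e.g. HTTP_PROXY) when connecting
argopy.set_options(trust_env=True)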

  • Documentation on Read The Docs now uses a pip environment and gets rid of the memory-hungry conda. (#103) by G. Maze.

  • xarray.Dataset argopy accessor argo has a clean documentation.

Breaking changes with previous versions

  • Drop support for python 3.6 and older. Lock range of dependencies version support.

  • In the plotters module, the plot_dac and plot_profilerType functions have been replaced by bar_plot. (#56) by G. Maze.
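
A minimal sketch of the replacement call (assuming bar_plot takes an index dataframe and a by keyword such as 'institution'):

from argopy import IndexFetcher as ArgoIndexFetcher
from argopy.plotters import bar_plot
df = ArgoIndexFetcher().float([6902766, 6902772]).to_dataframe()
bar_plot(df, by='institution')  # 'by' column name is an assumption, check the plotters docs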

Internals

  • Internal logging is now available and dependency version support has been upgraded (#56) by G. Maze. To see internal logs, you can set up your application like this:

import logging
DEBUGFORMATTER = '%(asctime)s [%(levelname)s] [%(name)s] %(filename)s:%(lineno)d: %(message)s'
logging.basicConfig(
    level=logging.DEBUG,
    format=DEBUGFORMATTER,
    datefmt='%m/%d/%Y %I:%M:%S %p',
    handlers=[logging.FileHandler("argopy.log", mode='w')]
)

v0.1.7 (4 Jan. 2021)#

Long overdue release!

Features and front-end API

  • New status monitor to check the availability of argopy data sources with argopy.status():

import argopy
argopy.status()
# or
argopy.status(refresh=15)

[Figure: status monitor of argopy data sources]
  • Optimise large data fetching with parallelization, for all data fetchers (erddap, localftp and argovis). See documentation page on Parallel data fetching. Two parallel methods are available: multi-threading or multi-processing. (#28) by G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
loader = ArgoDataFetcher(parallel=True)
loader.float([6902766, 6902772, 6902914, 6902746]).to_xarray()
loader.region([-85,-45,10.,20.,0,1000.,'2012-01','2012-02']).to_xarray()

Breaking changes with previous versions

  • In the teos10 xarray accessor, the standard_name attribute will now be populated using values from the CF Standard Name table if one exists. The previous values of standard_name have been moved to the long_name attribute. (#74) by A. Barna.

  • The unique resource identifier property is now named uri for all data fetchers; it is always a list of strings.
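
A minimal sketch of how this property can be inspected:

from argopy import DataFetcher as ArgoDataFetcher
loader = ArgoDataFetcher(src='erddap').float(6902746)
loader.uri  # list of strings, one per resource to be fetched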

Internals

  • New open_mfdataset and open_mfjson methods in Argo stores. These can be used to open, pre-process and concatenate a collection of paths, either sequentially or in parallel. (#28) by G. Maze.

  • Unit testing is now done in a controlled conda environment. This makes it easier to distinguish errors coming from development from errors due to dependency updates. (#65) by G. Maze.

v0.1.6 (31 Aug. 2020)#

  • JOSS paper published. You can now cite argopy with a clean reference. (#30) by G. Maze and K. Balem.

Maze G. and Balem K. (2020). argopy: A Python library for Argo ocean data analysis. Journal of Open Source Software, 5(52), 2425 doi: 10.21105/joss.02425.

v0.1.5 (10 July 2020)#

Features and front-end API

  • A new data source with the argovis data fetcher, all access points available (#24). By T. Tucker and G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
loader = ArgoDataFetcher(src='argovis')
loader.float(6902746).to_xarray()
loader.profile(6902746, 12).to_xarray()
loader.region([-85,-45,10.,20.,0,1000.,'2012-01','2012-02']).to_xarray()
  • Easily compute TEOS-10 variables with new argo accessor function teos10. This needs gsw to be installed. (#37) By G. Maze.

from argopy import DataFetcher as ArgoDataFetcher
ds = ArgoDataFetcher().region([-85,-45,10.,20.,0,1000.,'2012-01','2012-02']).to_xarray()
ds = ds.argo.teos10()
ds = ds.argo.teos10(['PV'])
ds_teos10 = ds.argo.teos10(['SA', 'CT'], inplace=False)

argopy can also be installed from conda-forge:

conda install -c conda-forge argopy

Breaking changes with previous versions

  • The local_ftp option of the localftp data source must now point to the folder where the dac directory is found. This breaks compatibility with rsynced local FTP copies because rsync does not give a dac folder (e.g. #33). An instructive error message is raised to notify users if any of the DAC names is found at the n-1 path level. (#34).
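
A minimal sketch of a valid setup (the path below is hypothetical and must contain the dac folder):

import argopy
from argopy import DataFetcher as ArgoDataFetcher
# '/data/Argo/ftp_copy' is expected to contain 'dac/<DAC_NAME>/...'
with argopy.set_options(local_ftp='/data/Argo/ftp_copy'):
    ds = ArgoDataFetcher(src='localftp').float(6902746).to_xarray()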

Internals

  • Implement a webAPI availability check in unit testing. This allows for more robust erddap and argovis tests that are not based on internet connectivity alone. (@5a46a39).

v0.1.4 (24 June 2020)#

Features and front-end API

  • Standard levels interpolation method available in standard user mode (#23). By K. Balem.

import numpy as np
from argopy import DataFetcher as ArgoDataFetcher
ds = ArgoDataFetcher().region([-85,-45,10.,20.,0,1000.,'2012-01','2012-12']).to_xarray()
ds = ds.argo.point2profile()
ds_interp = ds.argo.interp_std_levels(np.arange(0,900,50))
  • New argopy.dashboard() function to open the Argo data dashboard, either for the whole array or for a specific float:

import argopy
argopy.dashboard()
# or
argopy.dashboard(wmo=6902746)
  • The localftp index and data fetchers now have the region and profile access points available (#25). By G. Maze.
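
A minimal sketch of these new access points (WMO, cycle and region values are illustrative):

from argopy import DataFetcher as ArgoDataFetcher
loader = ArgoDataFetcher(src='localftp')
loader.profile(6902746, 34).to_xarray()
loader.region([-85, -45, 10., 20., 0, 1000., '2012-01', '2012-12']).to_xarray()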

Breaking changes with previous versions

[None]

Internals

  • Now uses fsspec as file system for caching as well as accessing local and remote files (#19). This closes issues #12, #15 and #17. argopy fetchers must now use (or implement if necessary) one of the internal file systems available in the new module argopy.stores. By G. Maze.

  • Erddap fetcher now uses netcdf format to retrieve data (#19).

v0.1.3 (15 May 2020)#

Features and front-end API

  • New index fetcher to explore and work with meta-data (#6). By K. Balem.

from argopy import IndexFetcher as ArgoIndexFetcher
idx = ArgoIndexFetcher().float(6902746)
idx.to_dataframe()
idx.plot('trajectory')

The index fetcher can manage caching and works with both the erddap and localftp data sources. It is basically the same as the data fetcher, but it does not load measurements, only meta-data. This can be very useful when looking at regional sampling or trajectories.

Tip

Performance: we recommend using the localftp data source when working with this index fetcher because the erddap data source currently suffers from poor performance. This is linked to #16 and is being addressed by Ifremer.

The index fetcher comes with basic plotting functionalities through the argopy.IndexFetcher.plot() method to rapidly visualise measurement distributions by DAC, latitude/longitude and float type.

Warning

The design of plotting and visualisation features in argopy is constantly evolving, so this may change in future releases.

Breaking changes with previous versions

  • The backend option in data fetchers and the global option datasrc have been renamed to src. This makes the code more coherent (@ec6b32e).

Code management

v0.1.2 (15 May 2020)#

We didn't like this one this morning, so we moved on to the next one!

v0.1.1 (3 Apr. 2020)#

Features and front-end API

  • Added new data fetcher backend localftp in DataFetcher (@c5f7cb6):

from argopy import DataFetcher as ArgoDataFetcher
argo_loader = ArgoDataFetcher(backend='localftp', path_ftp='/data/Argo/ftp_copy')
argo_loader.float(6902746).to_xarray()
  • Introduced global OPTIONS to set values for: cache folder, dataset (e.g. phy or bgc), local ftp path, data fetcher (erddap or localftp) and user level (standard or expert). Can be used within a context manager (@83ccfb5):

import argopy
with argopy.set_options(mode='expert', datasrc='erddap'):
    ds = argopy.DataFetcher().float(3901530).to_xarray()
  • Added an argopy.tutorial module to be able to load sample data for documentation and unit testing (@4af09b5):

import argopy
ftproot, flist = argopy.tutorial.open_dataset('localftp')
txtfile = argopy.tutorial.open_dataset('weekly_index_prof')
  • Improved xarray argo accessor. Added methods to cast data types, to filter variables according to data mode, and to filter variables according to quality flags. Also added useful methods to transform a collection of points into a collection of profiles, and vice versa (@14cda55):

import argopy
ds = argopy.DataFetcher().float(3901530).to_xarray() # get a collection of points
dsprof = ds.argo.point2profile() # transform to profiles
ds = dsprof.argo.profile2point() # transform to points
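
The casting and filtering methods mentioned above could be chained along these lines (a sketch only; the method names cast_types, filter_data_mode and filter_qc are assumptions to be checked against the accessor documentation):

import argopy
ds = argopy.DataFetcher(mode='expert').float(3901530).to_xarray()
ds = ds.argo.cast_types()        # enforce appropriate variable data types
ds = ds.argo.filter_data_mode()  # select variables according to their data mode
ds = ds.argo.filter_qc()         # keep only points with good quality flags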
  • Changed License from MIT to Apache (@25f90c9)

Internal machinery

  • Add __all__ to control from argopy import * (@83ccfb5)

  • All data fetchers inherit from class ArgoDataFetcherProto in proto.py (@44f45a5)

  • Data fetchers use default options from global OPTIONS

  • In Erddap fetcher: methods to cast data type, to filter by data mode and by QC flags are now delegated to the xarray argo accessor methods.

  • Data fetcher methods to filter variables according to user mode use variable lists defined in utilities.

  • argopy.utilities augmented with listing functions for: backends, standard variables and multi-profile file variables.
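
A minimal sketch (assuming the current function names in argopy.utilities):

from argopy.utilities import (
    list_available_data_src,
    list_standard_variables,
    list_multiprofile_file_variables,
)
list_available_data_src()           # available data fetcher backends
list_standard_variables()           # variables kept in 'standard' user mode
list_multiprofile_file_variables()  # variables found in multi-profile netcdf files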

  • Introduce custom errors in errors.py (@2563c9f)

  • Front-end API ArgoDataFetcher uses a more general way of auto-discovering fetcher backends and their access points. Turned off the deployments access point, waiting for the index fetcher to do that.

  • Improved xarray argo accessor. More reliable point2profile and data type casting with cast_type

Code management

  • Add CI with github actions (@ecbf9ba)

  • Contribution guideline for data fetchers (@b332495)

  • Improve unit testing (all along commits)

  • Introduce code coverage (@b490ab5)

  • Added explicit support for python 3.6 , 3.7 and 3.8 (@58f60fe)

v0.1.0 (17 Mar. 2020)#

  • Initial release.

  • Erddap data fetcher