sha,message,author_date,committer_date,raw_author,raw_committer,repo,author,committer
7dbbdcafed7f796ab77039ff797bcd31d9185903,"Bugfix in list_engine (#4811)

* fix list_engine

* fix store engine and netcdf4

* reve

* revert changes in guess_engine

* add registration of backend if dependencies are installed

* style mypy

* fix import

* use import instead of importlib

* black

* replace ImportError with ModuleNotFoundError

* fix typo

* fix typos

* remove else

* Revert remove imports inside backends functions

* Revert remove imports inside cfgrib

* modify check on imports inside the backends

* remove unused import",2021-01-19T10:10:25Z,2021-01-19T10:10:25Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
ac234619d5471e789b0670a673084dbb01df4f9e,"Remove entrypoints in setup for internal backends (#4724)

* add a dictionary for internal backends

* remove entrypoints in setup.cfg

* create global variable BACKEND_ENTRYPOINT
move BackendEntrypoints in common to solve circular dependency

* fix and update tests

* fix in tests_plugins to remove a warning",2020-12-24T16:29:44Z,2020-12-24T16:29:44Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
1525fb0b23b8e92420ab428dc3d918a658e92dd4,"remove autoclose in open_dataset and related warning test (#4725)

* remove autoclose in open_dataset and related warning test

* black

* remove autoclose from open_mfdataset

* update what's new",2020-12-24T16:25:26Z,2020-12-24T16:25:26Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
03d8d56c9b6d090f0de2475202368b08435eaeb5,"Remove unexpected warnings in tests (#4728)

* fix test on chunking (warnings for not close files)

* add filterwarnings in test_chunking_consistency

* add filterwarnings in test_remove_duplicates",2020-12-24T13:12:40Z,2020-12-24T13:12:40Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
ed0dadc273fc05766ec7e73a6980e02a8a360069,"Fix warning on chunks compatibility (#4726)

* fix warning on last chunk and modify test

* black

* update tests

* style: rename variable",2020-12-24T11:32:42Z,2020-12-24T11:32:42Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
5179cd92fd0d5438e2b7366619e21a242d0d55c3,"Remove close_on_error store.py (#4719)

* remove close on error for external store

* remove unused import",2020-12-22T14:31:04Z,2020-12-22T14:31:04Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
138679748558f41cd28f82a25046bc96b1c4d1ef,"Port all the engines to apiv2 (#4673)

* first draft: not working

* fix and black

* remove test_autoclose_future_warning

* remove blank lines

* fix

* reverted deletion of autoclose test. Added warning in apiv2

* isort

* revert delete test_autoclose_future_warning

* for backward compatibility: add store-backend
used in open_dataset if a store is passed instead of a file_name and an engine

* fix default in store.py

* remove unused open_dataset_parameters definition in store.py

* remove ""**kwargs"" in pseudonetcdf_backend.open_dataset_parameter definition

* add a comment to explain why the open_dataset_parameters are explicitly defined in pseudonetcdf_.py

* use store.open_backend_dataset_store to reduce duplicated code inside backends

* style

Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-12-17T16:21:57Z,2020-12-17T16:21:57Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
91318d2ee63149669404489be9198f230d877642,"add encodings[""preferred_chunks""], used in open_dataset instead of en… (#4669)

* add encodings[""preferred_chunks""], used in open_dataset instead of encodings[""chunks""]

* modify preferred_chunks: now it's a dictionary not a list

* fix if preferred_chunks is not defined",2020-12-17T16:05:56Z,2020-12-17T16:05:56Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
76d5c0c075628475b555997b82c55dd18a34936e,"change default in ds.chunk, dataarray.chunk and variable.chunk (#4633)

* change default in ds.chunk, dataarray.chunk and variable.chunk, and add warning

* add comment for chunks={} in dataset.chunk and dataarray.chunk

* fix: revert delete line",2020-12-10T10:38:05Z,2020-12-10T10:38:05Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
6d4a292f65cca30647fd222109325b6d5c3154ea,unify zarr chunking with other chunking in apiv2.open_dataset (#4667),2020-12-10T10:18:46Z,2020-12-10T10:18:46Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
74dffffbfea2ba9aea18ce194fe868f2cb00907d,"Backends entrypoints (#4577)

* Define _get_backends_cls function inside apiv2.py to read engines from plugins.py

* Read open_backends_dataset_* from entrypoints.

* Add backend entrypoints in setup.cfg

* Pass apiv2.py isort and black formatting tests.

* add dependencies

* add backend entrypoints and check on conflicts

* black

* removed global variable ENGINES
add class for entrypoints

* black isort

* add detect_engines in __all__ init.py

* removed entrypoints in py36-bare-minimum.yml and py36-min-all-deps.yml

* add entrypoints in IGNORE_DEPS

* Plugins test (#20)

- replace entrypoints with pkg_resources
- add tests

* fix typo

Co-authored-by: keewis <keewis@users.noreply.github.com>

* style

Co-authored-by: keewis <keewis@users.noreply.github.com>

* style

* Code style

* Code style

* fix: updated plugins.ENGINES with plugins.list_engines()

* fix

* One more correctness fix of the latest merge from master

Co-authored-by: TheRed86 <m.rossetti@bopen.eu>
Co-authored-by: keewis <keewis@users.noreply.github.com>
Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-12-10T09:56:12Z,2020-12-10T09:56:12Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
9802411b35291a6149d850e8e573cde71a93bfbf,"Modify zarr chunking as suggested in #4496 (#4646)

* modify get_chunks to align zarr chunking as described in issue #4496

* fix: maintain old open_zarr chunking interface

* add and fix tests

* black

* bugfix

* add few documentation on open_dataset chunking

* in test: re-add xfails for negative steps without dask

* Specify in reason that only zarr is expected to fail

* unify backend test negative_step with dask and without dask

* Add comment on has_dask usage

Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-12-09T12:26:44Z,2020-12-09T12:26:44Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
8ac3d862197204e6212a9882051808eb4b1cf3ff,"Refactor apiv2.open_dataset (#4642)

* in apiv2: rename ds in backend_ds and ds2 in ds

* add function _chunks_ds to simplify dataset_from_backend_dataset

* add small function _get_mtime to simplify _chunks_ds

* make resolve_decoders_kwargs and dataset_from_backend_dataset private",2020-12-02T13:17:26Z,2020-12-02T13:17:26Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
65308954787d313d81ced5fe33e6a4a49bcc2167,"Move get_chunks from zarr.py to dataset.py (#4632)

* move get_chunks from zarr to dataset and removed maybe_chunk in zarr

* move get_chunks from zarr to dataset and removed maybe_chunk in zarr

* black

* removed unused import

* update warning message in get_chunks

* Reformat warning text to use f-strings

Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-12-02T09:25:00Z,2020-12-02T09:25:00Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
ba989f65e800c1dd5a308c7f14bda89963ee2bd5,"Update signature open_dataset for API v2 (#4547)

* add in api.open_dataset dispatching to stub apiv2

* remove in apiv2 check for input AbstractDataStore

* bugfix typo

* add kwarg engines in _get_backend_cls needed by apiv2

* add alpha support for h5netcdf

* style: clean unused code, modify some variable/function names

* Add ENGINES entry for cfgrib.

* Define function open_backend_dataset_cfgrib() to be used in apiv2.py.
Add necessary imports for this function.

* Apply black to check formatting.

* Apply black to check formatting.

* add dummy zarr apiv2 backend

* align apiv2.open_dataset to api.open_dataset

* remove unused extra_coords in open_backend_dataset_*

* remove extra_coords in open_backend_dataset_cfgrib

* transform zarr maybe_chunk and get_chunks in classmethod
- to be used in apiv2 without instantiating the object

* make alpha zarr apiv2 working

* refactor apiv2.open_dataset:
- modify signature
- move default setting inside backends

* move dataset_from_backend_dataset out of apiv2.open_dataset

* remove blank lines

* remove blank lines

* style

* Re-write error messages

* Fix code style

* Fix code style

* remove unused import

* replace warning with ValueError for not supported kwargs in backends

* change zarr.ZarrStore.get_chunks into a static method

* group `backend_kwargs` and `kwargs` into the `extra_tokens` argument in `apiv2.dataset_from_backend_dataset`

* remove kwargs from the open_backend_dataset_${engine} signature and the related error message

* black

* Change signature of open_dataset function in apiv2 to include explicit decodings.

* Set an alias for chunks='auto'.

* Align empty rows with previous version.

* revert changes in chunks management

* move check on decoders from backends to open_dataset (apiv2)

* update documentation

* Change signature of open_dataset function in apiv2 to include explicit decodings.

* Set an alias for chunks='auto'.

* Align empty rows with previous version.

* revert changes in chunks management

* move check on decoders from backends to open_dataset (apiv2)

* update documentation

* change default value for decode_cf in open_dataset. The function behaviour is unchanged.

* Review docstring of open_dataset function.

* bugfix typo

* - add check on backends signatures
- add plugins.py containing backends info

* - black isort

* - add type declaration in plugins.py

* Fix the type hint for ENGINES

* Drop special case and simplify resolve_decoders_kwargs

* isort

Co-authored-by: TheRed86 <m.rossetti@bopen.eu>
Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-11-06T14:43:09Z,2020-11-06T14:43:09Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
cc271e61077c543e0f3b1a06ad5e905ea2c91617,"WIP: Proposed refactor of read API for backends (#4477)

* add in api.open_dataset dispatching to stub apiv2

* remove in apiv2 check for input AbstractDataStore

* bugfix typo

* add kwarg engines in _get_backend_cls needed by apiv2

* add alpha support for h5netcdf

* style: clean unused code, modify some variable/function names

* Add ENGINES entry for cfgrib.

* Define function open_backend_dataset_cfgrib() to be used in apiv2.py.
Add necessary imports for this function.

* Apply black to check formatting.

* Apply black to check formatting.

* add dummy zarr apiv2 backend

* align apiv2.open_dataset to api.open_dataset

* remove unused extra_coords in open_backend_dataset_*

* remove extra_coords in open_backend_dataset_cfgrib

* transform zarr maybe_chunk and get_chunks in classmethod
- to be used in apiv2 without instantiating the object

* make alpha zarr apiv2 working

* refactor apiv2.open_dataset:
- modify signature
- move default setting inside backends

* move dataset_from_backend_dataset out of apiv2.open_dataset

* remove blank lines

* remove blank lines

* style

* Re-write error messages

* Fix code style

* Fix code style

* remove unused import

* replace warning with ValueError for not supported kwargs in backends

* change zarr.ZarrStore.get_chunks into a static method

* group `backend_kwargs` and `kwargs` into the `extra_tokens` argument in `apiv2.dataset_from_backend_dataset`

* remove kwargs from the open_backend_dataset_${engine} signature and the related error message

* black

* Try add a strategy with an environment variable

* Try add a strategy with an environment variable

* black

Co-authored-by: TheRed86 <m.rossetti@bopen.eu>
Co-authored-by: Alessandro Amici <a.amici@bopen.eu>",2020-10-22T15:06:38Z,2020-10-22T15:06:38Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
49e3032ddfa3fe86361300fd08db4764ee718bf1,"Remove maybe_chunk duplicated function (#4494)

* move functions selkeys and maybe_chunk outside dataset.chunk
add in maybe_chunk the key overwrite_encoded_chunks for zarr

* replace ZarrStore.maybe_chunk with dataset._maybe_chunk + ZarrStore.get_chunks

* remove no more used ZarrStore.maybe_chunk

* style

* style

* style

* fix typo

* move `dataset._selkeys` logic inside _maybe_chunk",2020-10-08T15:10:45Z,2020-10-08T15:10:45Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447
742d00076c8e79cb753b4b4856dbbef5f52878c6,"#1621 optional decode timedelta (#4071)

* add decode_timedelta kwarg in decode_cf and open_* functions and test.

* Fix style issue

* Add change author reference

* removed check decode_timedelta in open_dataset

* fix docstring indentation

* fix: force dtype in test decode_timedelta",2020-05-19T15:43:53Z,2020-05-19T15:43:53Z,1c23ee020f875c83a25f902c0d72d85cbd48eee3,cd792325681cbad9f663f2879d8b69f1edbb678f,13221727,35919497,19864447