

commits: ba989f65e800c1dd5a308c7f14bda89963ee2bd5


sha: ba989f65e800c1dd5a308c7f14bda89963ee2bd5
message: Update signature open_dataset for API v2 (#4547)
  • add in api.open_dataset dispatching to stub apiv2
  • remove in apiv2 the check for AbstractDataStore input
  • fix typo
  • add kwarg engines in _get_backend_cls, needed by apiv2
  • add alpha support for h5netcdf
  • style: remove unused code, rename some variables/functions
  • Add ENGINES entry for cfgrib.
  • Define function open_backend_dataset_cfgrib() to be used in apiv2.py. Add necessary imports for this function.
  • Apply black to check formatting.
  • Apply black to check formatting.
  • add dummy zarr apiv2 backend
  • align apiv2.open_dataset to api.open_dataset
  • remove unused extra_coords in open_backend_dataset_*
  • remove extra_coords in open_backend_dataset_cfgrib
  • turn zarr maybe_chunk and get_chunks into classmethods, to be usable in apiv2 without instantiating the object
  • make alpha zarr apiv2 work
  • refactor apiv2.open_dataset: modify the signature; move default setting inside the backends
  • move dataset_from_backend_dataset out of apiv2.open_dataset
  • remove blank lines
  • remove blank lines
  • style
  • Re-write error messages
  • Fix code style
  • Fix code style
  • remove unused import
  • replace warning with ValueError for unsupported kwargs in backends
  • change zarr.ZarrStore.get_chunks into a static method
  • group `backend_kwargs` and `kwargs` in the `extra_tokes` argument in `apiv2.dataset_from_backend_dataset`
  • remove kwargs from the open_backend_dataset_${engine} signature and the related error message
  • black
  • Change signature of open_dataset function in apiv2 to include explicit decodings.
  • Set an alias for chunks='auto'.
  • Align empty rows with previous version.
  • reverse changes in chunks management
  • move check on decoders from backends to open_dataset (apiv2)
  • update documentation
  • Change signature of open_dataset function in apiv2 to include explicit decodings.
  • Set an alias for chunks='auto'.
  • Align empty rows with previous version.
  • reverse changes in chunks management
  • move check on decoders from backends to open_dataset (apiv2)
  • update documentation
  • change the default value for decode_cf in open_dataset; the function behaviour is unchanged
  • Review docstring of open_dataset function.
  • fix typo
  • add check on backend signatures; add plugins.py containing backend info
  • black, isort
  • add type declaration in plugins.py
  • Fix the type hint for ENGINES
  • Drop special case and simplify resolve_decoders_kwargs
  • isort
  Co-authored-by: TheRed86 <m.rossetti@bopen.eu>
  Co-authored-by: Alessandro Amici <a.amici@bopen.eu>
author_date: 2020-11-06T14:43:09Z
committer_date: 2020-11-06T14:43:09Z
raw_author: 1c23ee020f875c83a25f902c0d72d85cbd48eee3
raw_committer: cd792325681cbad9f663f2879d8b69f1edbb678f
repo: 13221727
author: 35919497
committer: 19864447
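The commit message repeatedly refers to an ENGINES mapping and a `_get_backend_cls`-style lookup that dispatches `open_dataset` to a per-engine `open_backend_dataset_*` function. A rough sketch of that registry pattern, under the assumption that each backend is a plain callable keyed by engine name (all names below are illustrative, not xarray's actual apiv2 code):

```python
# Illustrative engine-registry dispatch, echoing the ENGINES /
# _get_backend_cls pattern described in the commit message.
# ENGINES, register_engine, get_backend, and open_dataset are
# assumed names for this sketch, not xarray's real API.

ENGINES = {}

def register_engine(name):
    """Register an open_backend_dataset_* function under an engine name."""
    def decorator(func):
        ENGINES[name] = func
        return func
    return decorator

def get_backend(engine):
    # Raise ValueError for unrecognized engines, matching the commit's
    # move from warnings to ValueError for unsupported inputs.
    if engine not in ENGINES:
        raise ValueError(
            f"unrecognized engine {engine!r}; must be one of {sorted(ENGINES)}"
        )
    return ENGINES[engine]

@register_engine("dummy")
def open_backend_dataset_dummy(filename_or_obj, **decoder_kwargs):
    # A real backend would return an xarray.Dataset; a dict stands in here.
    return {"source": filename_or_obj, "decoders": decoder_kwargs}

def open_dataset(filename_or_obj, engine="dummy", **kwargs):
    backend = get_backend(engine)
    return backend(filename_or_obj, **kwargs)
```

In this shape, adding an engine (as the commit does for cfgrib, h5netcdf, and zarr) is just another registered function, and `open_dataset` itself stays backend-agnostic.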
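Several bullets concern decoder keyword handling: checking backend signatures, moving the decoder check from the backends into `open_dataset`, and simplifying `resolve_decoders_kwargs`. A hypothetical sketch of that idea, assuming the check works by inspecting which parameters a backend function accepts (the function body and names here are guesses for illustration, not the real implementation):

```python
# Hypothetical resolve_decoders_kwargs: keep only decoder kwargs the
# backend's signature accepts, treat None as "use the backend default",
# and raise ValueError for unsupported decoders, as the commit message
# suggests ("add check on backend signatures", "replace warning with
# ValueError for unsupported kwargs in backends").
import inspect

def resolve_decoders_kwargs(backend_func, **decoders):
    accepted = set(inspect.signature(backend_func).parameters)
    resolved = {}
    for name, value in decoders.items():
        if value is None:
            continue  # None means: let the backend apply its own default
        if name not in accepted:
            raise ValueError(
                f"decoder {name!r} is not supported by {backend_func.__name__}"
            )
        resolved[name] = value
    return resolved

def open_backend_dataset_example(filename_or_obj, mask_and_scale=True):
    # Stand-in backend accepting a single decoder kwarg.
    return {"source": filename_or_obj, "mask_and_scale": mask_and_scale}
```

Centralizing this in `open_dataset` means each `open_backend_dataset_*` function only declares the decoders it supports, rather than re-implementing the validation.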