id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
297631403,MDExOlB1bGxSZXF1ZXN0MTY5NTEyMjU1,1915,h5netcdf new API support,6213168,closed,0,,,13,2018-02-15T23:15:55Z,2018-05-11T23:49:00Z,2018-05-08T02:25:40Z,MEMBER,,0,pydata/xarray/pulls/1915,"Closes #1536

Support arbitrary compression plugins through the h5netcdf new API.

Done:
- public API and docstrings (untested)
- implementation
- unit tests
- What's New","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1915/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
320007162,MDU6SXNzdWUzMjAwMDcxNjI=,2102,resample DeprecationWarning only on 1-D arrays?,17162724,closed,0,,,1,2018-05-03T17:13:55Z,2018-05-08T17:36:22Z,2018-05-08T17:36:22Z,CONTRIBUTOR,,,,"#### Code Sample, a copy-pastable example if possible

```python
da = xr.DataArray(np.array([1,2,3,4], dtype=np.float).reshape(2,2),
                  coords=[pd.date_range('1/1/2000', '1/2/2000', freq='D'),
                          np.linspace(0,1,num=2)],
                  dims=['time', 'latitude'])

da.resample(freq='M', dim='time', how='mean')
#/Users/Ray/anaconda/envs/rot-eof-dev-env/bin/ipython:1: DeprecationWarning:
#.resample() has been modified to defer calculations. Instead of passing 'dim' and 'how=""mean"",
#instead consider using .resample(time=""M"").mean()
#
#!/Users/Ray/anaconda/envs/rot-eof-dev-env/bin/python
#Out[66]:
#
#array([[2., 3.]])
#Coordinates:
# * time (time) datetime64[ns] 2000-01-31
# * latitude (latitude) float64 0.0 1.0

da.resample(time=""M"").mean()
#
#array([2.5])
#Coordinates:
# * time (time) datetime64[ns] 2000-01-31
```

#### Problem description

The replacement suggested by the DeprecationWarning only seems to work for 1-D arrays, because it doesn't specify a dimension to average along, so `.mean()` reduces over every dimension. A quick fix could be to show the warning only if the DataArray/Dataset is 1-D. A more thorough fix could be to make `.resample(time=""M"").mean()` behave like `.resample(freq='M', dim='time', how='mean')`???

#### Expected Output

Same as `da.resample(freq='M', dim='time', how='mean')`

#### Output of ``xr.show_versions()``
xr.show_versions() # Not sure about the h5py FutureWarning?

/Users/Ray/anaconda/envs/rot-eof-dev-env/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Darwin
OS-release: 17.5.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

xarray: 0.10.3
pandas: 0.22.0
numpy: 1.14.2
scipy: 1.0.1
netCDF4: 1.3.1
h5netcdf: 0.5.1
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.2
distributed: 1.21.6
matplotlib: 2.2.2
cartopy: 0.16.0
seaborn: None
setuptools: 39.0.1
pip: 9.0.3
conda: None
pytest: None
IPython: 6.3.1
sphinx: None
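
A minimal sketch, assuming the new-style resample API shown in the warning, of how to reproduce the old `.resample(freq='M', dim='time', how='mean')` behaviour for the 2-D example above by passing the reduction dimension explicitly:

```python
import numpy as np
import pandas as pd
import xarray as xr

# Same 2x2 DataArray as in the code sample above (plain float instead of the
# deprecated np.float alias)
da = xr.DataArray(np.array([1, 2, 3, 4], dtype=float).reshape(2, 2),
                  coords=[pd.date_range('1/1/2000', '1/2/2000', freq='D'),
                          np.linspace(0, 1, num=2)],
                  dims=['time', 'latitude'])

# Reduce over 'time' only, keeping 'latitude' intact; this matches the result
# of the deprecated .resample(freq='M', dim='time', how='mean') call
monthly = da.resample(time='M').mean(dim='time')
print(monthly)  # values [[2., 3.]] with dims (time: 1, latitude: 2)
```

Passing `dim` explicitly avoids depending on which dimensions the bare `.mean()` reduces over.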
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 319985789,MDExOlB1bGxSZXF1ZXN0MTg1NzY2MTg3,2101,DOC: Add resample e.g. Edit rolling e.g. Add groupby e.g.,17162724,closed,0,,,4,2018-05-03T16:08:48Z,2018-05-08T15:46:17Z,2018-05-08T04:23:03Z,CONTRIBUTOR,,0,pydata/xarray/pulls/2101," - [NA] Closes #xxxx (remove if there is no corresponding issue, which should only be the case for minor changes) - [NA] Tests added (for all bug fixes or enhancements) - [NA] Tests passed (for all non-documentation changes) - [NA] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later) Added a `resample` example to reflect the `DeprecationWarning: .resample() has been modified to defer calculations. Instead of passing 'dim' and 'how=""mean"", instead consider using...`. There were also some missing parameters in the docs. Made a minor edit to my `rolling` example which uses the parameter `center`. I can't remember if that parameter was there last time but I think it's useful to return the time value for the middle of window. Added a `groupby` example. Not sure if this should go here. See text below for my motivation of putting an example in the docstring of the class/object. I learnt `xarray` before I learnt `pandas` so these example will hopefully be useful to other beginners as it took me a minute to get the syntax right. Whilst there are great examples of these functions in the docs, a google search of a function for example `xarray rolling` often returns the doc entry that that object. Having a little example there is helpful for me to get the syntax right. The key/value pair in a dictionary for example is very powerful and always takes me a few tries to get the entry right.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2101/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 253476466,MDU6SXNzdWUyNTM0NzY0NjY=,1536,Better compression algorithms for NetCDF,6213168,closed,0,,,28,2017-08-28T22:35:31Z,2018-05-08T02:25:40Z,2018-05-08T02:25:40Z,MEMBER,,,,"As of today, ``Dataset.to_netcdf()`` exclusively allows writing uncompressed or compressed with zlib. zlib was absolutely revolutionary when it was released... in 1995. Time has passed, and much better compression algorithms have appeared over time. Good news is, h5py supports LZF out of the box, and is extensible with plugins to support theoretically any other algorithm. h5netcdf exposes such interface through its new (non-legacy) API; however ``Dataset.to_netcdf(engine='h5netcdf')`` supports the legacy API exclusively. I already tested that, once you manage to write to disk with LZF (using h5netcdf directly), ``open_dataset(engine='h5netcdf')`` transparently opens the compressed store. Options: - write a new engine for ``Dataset.to_netcdf()`` to support the new h5netcdf API. - switch the whole ``engine='h5netcdf'`` to the new API. Drop support for the old parameters in ``to_netcdf()``. This is less bad than it sounds, as people can switch to another engine in case of trouble. This is the cleanest solution, but also the most disruptive one. 
- switch the whole ``engine='h5netcdf'`` to the new API; have ``to_netcdf()`` accept both new and legacy parameters, and implement a translation layer from the legacy parameters to the new API. The benefit here is that, as long as the user sticks to the legacy API, they can hop between engines transparently. On the other hand, I have a hard time believing anybody would care.
- ?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1536/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue