id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 1362895131,I_kwDOAMm_X85RPCEb,6996,"attrs are now views, not copies",10050469,closed,0,,,5,2022-09-06T08:23:59Z,2022-09-06T10:45:43Z,2022-09-06T10:45:36Z,MEMBER,,,,"### What is your issue? I'm not sure yet if this is a feature or a bug - I would tend to the latter. Apologies if this has been discussed before. Objects originating from operations such as `y = x > 2` are now sharing the same `attrs`, which leads to things like: ```python import numpy as np import xarray as xr xr.__version__ '2022.6.0' x = xr.DataArray( 0.1 * np.arange(10), dims=[""lat""], coords={""lat"": np.arange(10)}, name=""sst"", ) x.lat.attrs['long_name'] = 'latitude' x.lat.attrs {'long_name': 'latitude'} y = x > 2 y.lat.attrs {'long_name': 'latitude'} y.lat.attrs = {} x.lat.attrs # x is changed as well! {} ``` I think this is rather a non-intuitive behavior but I'm happy to discuss!","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6996/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1051241489,I_kwDOAMm_X84-qKwR,5976,Should str.format() work on xarray scalars?,10050469,closed,0,,,3,2021-11-11T18:15:59Z,2022-07-25T20:01:29Z,2022-07-25T20:01:29Z,MEMBER,,,,"Consider: ```python da = xr.DataArray([1, 2, 3]) print(f'{da[0]}') print(f'{da[0]:d}') ``` Which outputs: ``` array(1) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in 1 da = xr.DataArray([1, 2, 3]) 2 print(f'{da[0]}') ----> 3 print(f'{da[0]:d}') TypeError: unsupported format string passed to DataArray.__format__ ``` And the numpy equivalent: ```python da = xr.DataArray([1, 2, 3]).data print(f'{da[0]}') print(f'{da[0]:d}') 1 1 ``` I always found the xarray scalar output to be a bit unfriendly for beginners. In my classes very often scalars are the last output of a computation, and the fact that we can't format the relatively verbose xarray output without resulting to the `.data` trick is a bit confusing for students (but I agree this is a detail). Is there a way to get `print(f'{da[0]:d}')` to work? Thoughts? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5976/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 1224347682,PR_kwDOAMm_X843PYZJ,6569,Add some warnings about rechunking to the docs,10050469,closed,0,,,3,2022-05-03T16:48:02Z,2022-05-10T05:54:13Z,2022-05-10T05:54:05Z,MEMBER,,0,pydata/xarray/pulls/6569," This adds some warnings at the right places when rechunking a dataset opened with `open_mfdataset` (see https://github.com/pangeo-data/rechunker/issues/100#issuecomment-1116189019 for context) Thanks to @dcherian for the wisdom of the day! 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/6569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1051772149,PR_kwDOAMm_X84ucj1i,5981,Allow string formatting of scalar DataArrays,10050469,closed,0,,,4,2021-11-12T09:44:43Z,2022-05-09T15:25:25Z,2022-05-09T15:25:02Z,MEMBER,,0,pydata/xarray/pulls/5981,"- [x] Closes https://github.com/pydata/xarray/issues/5976 - [x] Tests added - [x] Passes `pre-commit run --all-files` - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` This is a first try at formatting dataarray scalars. Here is the current behavior: ```python In [1]: import xarray as xr ...: import numpy as np In [2]: a = np.array(1) ...: da = xr.DataArray(a) In [3]: print(a) 1 In [4]: print(da) array(1) In [5]: print('{}'.format(a)) 1 In [6]: print('{}'.format(da)) array(1) In [7]: print('{:.3f}'.format(a)) 1.000 In [8]: print('{:.3f}'.format(da)) 1.000 In [9]: a = np.array([1, 2]) ...: da = xr.DataArray(a) In [10]: print('{}'.format(a)) [1 2] In [11]: print('{}'.format(da)) array([1, 2]) Dimensions without coordinates: dim_0 In [12]: print('{:.3f}'.format(a)) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in ----> 1 print('{:.3f}'.format(a)) TypeError: unsupported format string passed to numpy.ndarray.__format__ In [13]: print('{:.3f}'.format(da)) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in ----> 1 print('{:.3f}'.format(da)) ~/disk/Dropbox/HomeDocs/git/xarray/xarray/core/common.py in __format__(self, format_spec) 162 return formatting.array_repr(self) 163 # Else why fall back to numpy --> 164 return self.values.__format__(format_spec) 165 166 def _iter(self: Any) -> Iterator[Any]: TypeError: unsupported format string passed to numpy.ndarray.__format__ ``` I don't think there is any backwards compatibility issue but lets see if the tests pass","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5981/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 424916834,MDExOlB1bGxSZXF1ZXN0MjY0MTMxNTU3,2850,More informative error when writing attrs to netCDF,10050469,closed,0,,,8,2019-03-25T13:58:30Z,2022-05-03T16:16:21Z,2022-05-03T16:16:21Z,MEMBER,,0,pydata/xarray/pulls/2850,"Closes #3080 Some attribute names aren't valid netCDF4 keys. For example: ```python import xarray as xr ds = xr.Dataset({'x': ('y', [1, 2, 3], {'CLASS': 'foo'})}) ds.to_netcdf('test.nc') ``` will fail with: ```NetCDF: String match to name in use``` This is hard to debug if you don't know which attribute is faulty. With this small change the error should be more informative: ```AttributeError: The following exception occurred when attempting to set attribute (CLASS, foo): ""NetCDF: String match to name in use""``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2850/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 1051366862,PR_kwDOAMm_X84ubTA8,5978,"Add ""see also"" in to_dataframe docs",10050469,closed,0,,,0,2021-11-11T21:15:15Z,2021-11-13T17:35:43Z,2021-11-13T17:35:43Z,MEMBER,,0,pydata/xarray/pulls/5978,"A very modest contribution... I miss contributing to xarray! 
Believe it or not, I did not know about `to_pandas` until today... ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5978/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 256496917,MDU6SXNzdWUyNTY0OTY5MTc=,1565,Regression: time attributes on PeriodIndex,10050469,open,0,,,12,2017-09-10T09:27:09Z,2021-07-20T18:33:29Z,,MEMBER,,,,"The following used to work with xarray 0.9.5 but doesn't anymore with 0.9.6 or master: ```python import xarray as xr import pandas as pd import numpy as np time = pd.period_range('2000-01', '2000-12', freq='M') da = xr.DataArray(np.arange(12), dims=['time'], coords={'time':time}) da['time.month'] ``` ``` --------------------------------------------------------------------------- KeyError Traceback (most recent call last) ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in _getitem_coord(self, key) 458 try: --> 459 var = self._coords[key] 460 except KeyError: KeyError: 'time.month' During handling of the above exception, another exception occurred: AttributeError Traceback (most recent call last) in () 4 time = pd.period_range('2000-01', '2000-12', freq='M') 5 da = xr.DataArray(np.arange(12), dims=['time'], coords={'time':time}) ----> 6 da['time.month'] ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in __getitem__(self, key) 467 def __getitem__(self, key): 468 if isinstance(key, basestring): --> 469 return self._getitem_coord(key) 470 else: 471 # orthogonal array indexing ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in _getitem_coord(self, key) 461 dim_sizes = dict(zip(self.dims, self.shape)) 462 _, key, var = _get_virtual_variable( --> 463 self._coords, key, self._level_coords, dim_sizes) 464 465 return self._replace_maybe_drop_dims(var, name=key) ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in _get_virtual_variable(variables, key, level_vars, dim_sizes) 82 data = getattr(ref_var.dt, var_name).data 83 else: ---> 84 data = getattr(ref_var, var_name).data 85 virtual_var = Variable(ref_var.dims, data) 86 AttributeError: 'IndexVariable' object has no attribute 'month' ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 840258082,MDU6SXNzdWU4NDAyNTgwODI=,5073,`lock` kwarg needs a deprecation cycle?,10050469,closed,0,,,6,2021-03-24T22:39:15Z,2021-05-04T14:31:10Z,2021-05-04T14:30:09Z,MEMBER,,,," Salem's tests on master fail because I use the `lock` kwarg to `open_dataset`, which seems to have disappeared in the backend refactoring. Should the new `open_dataset` simply ignore `lock`, and raise a `FutureWarning` when used?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/5073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 771127744,MDU6SXNzdWU3NzExMjc3NDQ=,4710,open_mfdataset -> to_netcdf() randomly leading to dead workers,10050469,closed,0,,,4,2020-12-18T19:42:14Z,2020-12-22T11:54:37Z,2020-12-22T11:54:37Z,MEMBER,,,,"This is: - xarray: 0.16.2 - dask: 2.30.0 I'm not sure a github issue is the right place to report this, but I'm not sure where else, so here it is. I just had two very long weeks of debugging stalled (i.e. 
""dead"") OGGM jobs in a cluster environment. I finally nailed it down to `ds.to_netcdf(path)` in this situation: ```python with xr.open_mfdataset(tmp_paths, combine='nested', concat_dim='rgi_id') as ds: ds.to_netcdf(path) ``` `tmp_paths` are a few netcdf files (from 2 to about 60). The combined dataset is nothing close to big (a few hundred MB at most). Most of the time, this command works just fine. But in 30% of the cases, this would just... stop and stall. One or more of the workers would simply stop working without coming back or erroring. What I can give as additional information: - changing `ds.to_netcdf(path)` to `ds.load().to_netcdf(path)` solves the problem - the problem became *worse* (i.e. more often) when the files to concatenate increased in the number of variables (the final size of the concatenated file doesn't seem to matter at all, it occurs also with files < 1 MB) - I can't reproduce the problem locally. The files are [here](https://cluster.klima.uni-bremen.de/~fmaussion/misc/xarray_stall/) if someone's interested, but I don't think the files are the issue here. - the files use gzip compression - On cluster, we are dealing with 64 core nodes, which do a lot of work before arriving to these two lines. We use python multiprocessing ourselves before that, create our own pool and use it, etc. But at the moment the job hits these two lines, no other job is running. Is this is some kind of weird interaction between our own multiprocessing and dask? Is it more an IO problem that occurs only on cluster? I don't know. I know this is a crappy bug report, but the fact that I lost a lot of time on this recently has gone on my nerves :wink: (I'm mostly angry at myself for taking so long to find out that these two lines were the problem). In order to make a question out of this crappy report: **how can I possibly debug this?** I solved my problem now (with `ds.load()`), but this is not really satisfying. Any tip is appreciated! cc @TimoRoth our cluster IT whom I annoyed a lot before finding out that the problem was in xarray/dask","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4710/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 184456540,MDU6SXNzdWUxODQ0NTY1NDA=,1056,groupby_bins along two dims simultaneously,10050469,open,0,,,3,2016-10-21T10:50:06Z,2020-10-04T05:06:37Z,,MEMBER,,,,"I probably missed it, but what is the way to apply groupby (or rather groupby_bins) in order to achieve the following in xarray? ``` python da = xr.DataArray(np.arange(16).reshape((4, 4))) da array([[ 0, 1, 2, 3], [ 4, 5, 6, 7], [ 8, 9, 10, 11], [12, 13, 14, 15]]) Coordinates: * dim_0 (dim_0) int64 0 1 2 3 * dim_1 (dim_1) int64 0 1 2 3 # should be aggregated to (in case of summing) to obtain dagg array([[10, 18], [42, 50]]) Coordinates: * dim_1 (dim_1) int64 0 2 * dim_0 (dim_0) int64 0 2 ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1056/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,issue 177903376,MDU6SXNzdWUxNzc5MDMzNzY=,1009,Shouldn't .where() pass the attributes of DataArrays and DataSets? ,10050469,closed,0,,,4,2016-09-19T21:30:13Z,2020-04-05T19:11:46Z,2016-09-21T17:26:32Z,MEMBER,,,,"Everything is in the title! I think it should, if possible. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1009/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 247054695,MDU6SXNzdWUyNDcwNTQ2OTU=,1498,Update doc example for open_mfdataset,10050469,closed,0,,,1,2017-08-01T12:37:58Z,2019-08-01T13:13:36Z,2019-08-01T13:13:36Z,MEMBER,,,,"The current doc shows bits of code which are now irrelevant thanks to open_mfdataset: http://xarray.pydata.org/en/stable/io.html#id7 On a related note, it would be great to document the bottlenecks in concat / dask and how to overcome them. Related to https://github.com/pydata/xarray/issues/1391, https://github.com/pydata/xarray/issues/1495, and https://github.com/pydata/xarray/issues/1379 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1498/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 230214809,MDU6SXNzdWUyMzAyMTQ4MDk=,1418,Indexing time with lists,10050469,closed,0,,,3,2017-05-21T11:38:11Z,2019-06-29T01:58:33Z,2019-06-29T01:58:33Z,MEMBER,,,,"Is this a bug? Look the following example: ```python ds = xr.tutorial.load_dataset('air_temperature') ds.sel(time='2013-01-01T00:00') # works fine [output removed] ds.sel(time=['2013-01-01T00:00']) # errors Traceback (most recent call last): File ""/home/mowglie/.pycharm-community-2017.1/helpers/pydev/_pydevd_bundle/pydevd_exec2.py"", line 3, in Exec exec(exp, global_vars, local_vars) File """", line 1, in File ""/home/mowglie/Documents/git/xarray-official/xarray/core/dataset.py"", line 1206, in sel self, indexers, method=method, tolerance=tolerance File ""/home/mowglie/Documents/git/xarray-official/xarray/core/indexing.py"", line 290, in remap_label_indexers dim, method, tolerance) File ""/home/mowglie/Documents/git/xarray-official/xarray/core/indexing.py"", line 229, in convert_label_indexer % index_name) KeyError: ""not all values found in index 'time'"" ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 383791450,MDU6SXNzdWUzODM3OTE0NTA=,2565,CF conventions: time_bnds and time units ,10050469,closed,0,,,4,2018-11-23T11:32:37Z,2019-01-08T22:28:37Z,2019-01-08T22:28:37Z,MEMBER,,,,"### Problem Here is the dump of a NetCDF file ([download](https://github.com/OGGM/oggm-sample-data/raw/master/test-files/cesm.TREFHT.160001-200512.selection.nc)): ``` netcdf cesm.TREFHT.160001-200512.selection { dimensions: time = UNLIMITED ; // (4872 currently) lat = 3 ; lon = 3 ; nbnd = 2 ; variables: float TREFHT(time, lat, lon) ; TREFHT:units = ""K"" ; TREFHT:long_name = ""Reference height temperature"" ; TREFHT:cell_methods = ""time: mean"" ; double lat(lat) ; lat:long_name = ""latitude"" ; lat:units = ""degrees_north"" ; double lon(lon) ; lon:long_name = ""longitude"" ; lon:units = ""degrees_east"" ; double time(time) ; time:long_name = ""time"" ; time:units = ""days since 0850-01-01 00:00:00"" ; time:calendar = ""noleap"" ; time:bounds = ""time_bnds"" ; double time_bnds(time, nbnd) ; time_bnds:long_name = ""time interval endpoints"" ; // global attributes: :Conventions = ""CF-1.0"" ; :source = ""CAM"" ; ... 
} ``` When xarray decodes the time coordinates it also deletes the `time:units` attribute (this kind of makes sense, because the unit has no meaning when the time is converted to a CFTime object): ```python import xarray as xr ds = xr.open_dataset(f) ds.time array([cftime.DatetimeNoLeap(1600, 2, 1, 0, 0, 0, 0, 0, 32), cftime.DatetimeNoLeap(1600, 3, 1, 0, 0, 0, 0, 0, 60), cftime.DatetimeNoLeap(1600, 4, 1, 0, 0, 0, 0, 3, 91), ..., cftime.DatetimeNoLeap(2005, 11, 1, 0, 0, 0, 0, 6, 305), cftime.DatetimeNoLeap(2005, 12, 1, 0, 0, 0, 0, 1, 335), cftime.DatetimeNoLeap(2006, 1, 1, 0, 0, 0, 0, 4, 1)], dtype=object) Coordinates: * time (time) object 1600-02-01 00:00:00 ... 2006-01-01 00:00:00 Attributes: long_name: time bounds: time_bnds ``` The problem is that I have no way to actually decode the `time_bnds` variable from xarray alone now, because the `time_bnds` variable doesn't store the time units. First, I thought that my file was not CF compliant but I've looked into the CF conventions and it looks like they are not prescribing that `time_bnds` should *also* have a `units` attribute. ### Solution I actually don't know what we should do here. I see a couple of ways: 1. we don't care and leave it to the user (here: me) to open the file with netCDF4 to decode the time bounds 2. we don't delete the `time:units` attribute after decoding 3. we start to also decode the `time_bnds` when available, like we do with `time` Thoughts? cc @spencerkclark @jhamman ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 344879199,MDU6SXNzdWUzNDQ4NzkxOTk=,2316,rasterio released v1 as stable,10050469,closed,0,,,1,2018-07-26T14:53:54Z,2019-01-08T22:19:32Z,2019-01-08T22:19:32Z,MEMBER,,,,"conda-forge now ships v1.0.1 per default. After two years of betas and release candidates this is very welcome! We have very little code specifically handling pre-v1 and post-v1, but we should keep it around for a couple more months. Will update the CI to reflect this change. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2316/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 384004189,MDExOlB1bGxSZXF1ZXN0MjMzMzI4OTc4,2571,CF: also decode time bounds when available,10050469,closed,0,,,7,2018-11-24T16:50:13Z,2018-12-19T17:19:05Z,2018-12-19T17:19:05Z,MEMBER,,0,pydata/xarray/pulls/2571," - [x] Closes https://github.com/pydata/xarray/issues/2565 - [x] Tests added - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API Not sure if this is the best way to handle it, but it seems to work","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 379177627,MDU6SXNzdWUzNzkxNzc2Mjc=,2551,HDF Errors since xarray 0.11,10050469,closed,0,,,10,2018-11-09T14:14:11Z,2018-11-12T00:12:46Z,2018-11-11T12:10:36Z,MEMBER,,,,"(EDIT: sorry for unexpected early posting) I just wanted to open this issue here, just to see if it has some resonance in other projects. We are getting new unexpected HDF Errors in our test suite which are definitely due to the recent xarray update (reverting to 0.10.9 solves the problem). 
The error is the famous (and very informative): ``` [Errno -101] NetCDF: HDF error: ``` I have not been able to create a MWE yet, but it has something to do with read -> close -> append workflows on netcdf4 files (the error happens at the ""append"" step). Possibly multiprocessing also plays a role, but I can't be sure yet. I will try to find a way to reproduce this with a simple example, but this might take a while... ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 379410537,MDExOlB1bGxSZXF1ZXN0MjI5ODgyNDY1,2552,Attempt to reproduce HDF error,10050469,closed,0,,,1,2018-11-10T10:22:47Z,2018-11-11T12:10:46Z,2018-11-11T12:10:46Z,MEMBER,,0,pydata/xarray/pulls/2552,This is just a test to see if I can reproduce an HDF error reported in https://github.com/pydata/xarray/issues/2551,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 372213595,MDExOlB1bGxSZXF1ZXN0MjI0NDc1ODA5,2498,Small fix in rasterio docs,10050469,closed,0,,,1,2018-10-20T13:48:51Z,2018-10-22T00:01:09Z,2018-10-22T00:01:07Z,MEMBER,,0,pydata/xarray/pulls/2498," - [x] Closes https://github.com/pydata/xarray/issues/2497 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2498/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 362516560,MDU6SXNzdWUzNjI1MTY1NjA=,2428,Stickler-ci,10050469,closed,0,,,2,2018-09-21T08:50:25Z,2018-10-07T22:40:08Z,2018-10-07T22:40:08Z,MEMBER,,,,"The last time stickler had a look at our PRs is 8 days ago (https://github.com/pydata/xarray/pull/2415) : https://stickler-ci.com/repositories/26661-pydata-xarray It looks like their bot is broken: https://github.com/stickler-ci This is not very trustworthy - we could consider switching to https://pep8speaks.com/ maybe","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 361818115,MDU6SXNzdWUzNjE4MTgxMTU=,2422,Plot2D no longer sorts coordinates before plotting,10050469,closed,0,,,6,2018-09-19T16:00:56Z,2018-09-21T17:47:12Z,2018-09-21T17:47:12Z,MEMBER,,,,"I have a dataset with decreasing latitude coordinates. With ``xarray`` v0.10.8, this is what happens when plotting: ![i1](https://user-images.githubusercontent.com/10050469/45765619-b58ceb00-bc35-11e8-94b4-5e37cb9f2a0a.png) But on latest master the image is now upside down: ![i2](https://user-images.githubusercontent.com/10050469/45765681-e53bf300-bc35-11e8-95cc-8e4fb520bfe5.png) Sorry if I missed a change along the way, I was off for a long time. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 362158655,MDExOlB1bGxSZXF1ZXN0MjE2OTQ4OTgx,2425,Plotting: restore xyincrease kwarg default to True,10050469,closed,0,,,2,2018-09-20T12:12:48Z,2018-09-21T17:36:23Z,2018-09-21T17:36:21Z,MEMBER,,0,pydata/xarray/pulls/2425," - [x] Closes https://github.com/pydata/xarray/issues/2422 - [x] Tests added - [x] Tests passed - [x] Fully documented, including `whats-new.rst` https://github.com/pydata/xarray/pull/2294 introduced the new ``xincrease`` and ``yincrease`` kwargs and a behavior documented as: ``` xincrease : None, True, or False, optional Should the values on the x axes be increasing from left to right? if None, use the default for the matplotlib function. ``` The default was ``None``, I suggest to set it to ``True``: it was the default before https://github.com/pydata/xarray/pull/2294 and I think it is what most users expect. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 344883324,MDExOlB1bGxSZXF1ZXN0MjA0MTcyOTI5,2317,Remove test on rasterio rc and test for 0.36 instead,10050469,closed,0,,,1,2018-07-26T15:03:20Z,2018-07-30T11:05:37Z,2018-07-30T11:05:30Z,MEMBER,,0,pydata/xarray/pulls/2317,See https://github.com/pydata/xarray/issues/2316,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 344096196,MDExOlB1bGxSZXF1ZXN0MjAzNTcyNTkw,2310,Make RTD builds faster,10050469,closed,0,,,4,2018-07-24T15:45:17Z,2018-07-27T08:20:43Z,2018-07-27T08:20:43Z,MEMBER,,0,pydata/xarray/pulls/2310,"See https://github.com/pydata/xarray/issues/2306 This makes the builds a little bit faster. But we are still very close to the 900 s limit. I tried to pin more packages this morning but this didn't work out. I'll merge this first (the latest build worked with this config, https://readthedocs.org/projects/xray/builds/) and try again at a later stage. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 343944521,MDU6SXNzdWUzNDM5NDQ1MjE=,2306,Timeouts errors on readthedocs,10050469,closed,0,,,3,2018-07-24T08:55:23Z,2018-07-26T14:41:48Z,2018-07-26T14:41:48Z,MEMBER,,,,"We are reaching the 900s build time limit on readthedocs more often than not (https://readthedocs.org/projects/xray/builds/). I have the same problem with all of my OS projects. The bottleneck is the conda environment installation, which took 457s on the latest failed build. 
I'm going to try to spare some time in a subsequent PR, but we might have to get in touch with the RTD people to get more build time.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 343260451,MDExOlB1bGxSZXF1ZXN0MjAyOTY2Mjgx,2303,"Rename ""Recipes"" to ""Gallery""",10050469,closed,0,,,1,2018-07-20T22:03:27Z,2018-07-23T16:33:08Z,2018-07-23T16:33:08Z,MEMBER,,0,pydata/xarray/pulls/2303,"Simply because it makes more sense after all. The link address won't change so it's not a biggie (http://xarray.pydata.org/en/latest/auto_gallery/index.html)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 337202341,MDExOlB1bGxSZXF1ZXN0MTk4NDY5MjMw,2260,Plotting: do not check for monotonicity with 2D coords,10050469,closed,0,,,3,2018-06-30T09:47:20Z,2018-07-03T09:17:02Z,2018-07-03T09:16:27Z,MEMBER,,0,pydata/xarray/pulls/2260," - [x] Closes https://github.com/pydata/xarray/issues/2250 - [x] Tests added - [x] Tests passed - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 323944717,MDExOlB1bGxSZXF1ZXN0MTg4Njc2Nzgw,2146,Add favicon to docs?,10050469,closed,0,,,3,2018-05-17T09:35:21Z,2018-05-19T20:37:18Z,2018-05-17T16:55:31Z,MEMBER,,0,pydata/xarray/pulls/2146,"Don't know if we want this, but it's possible to replace the default RTD icon. I'm not even sure if the cool kids are using bookmarks anymore, but I still do. This is how it looks like on my Firefox fav tab: ![selection_054](https://user-images.githubusercontent.com/10050469/40169465-597840fc-59c6-11e8-8287-87f6663d05ea.png) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2146/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 297452821,MDU6SXNzdWUyOTc0NTI4MjE=,1912,Code review bots?,10050469,closed,0,,,4,2018-02-15T13:51:39Z,2018-05-01T07:24:00Z,2018-05-01T07:24:00Z,MEMBER,,,,"I'm seeing them from time to time on other repositories. One that seems reasonable and not toooo intrusive is stickler, for code style review: https://stickler-ci.com/ ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1912/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 318827030,MDExOlB1bGxSZXF1ZXN0MTg0OTA4ODE2,2091,DOC: uniformize variable names in indexing.rst,10050469,closed,0,,,1,2018-04-30T09:11:14Z,2018-04-30T17:17:53Z,2018-04-30T17:17:53Z,MEMBER,,0,pydata/xarray/pulls/2091," - [x] Closes https://github.com/pydata/xarray/issues/2088 https://github.com/pydata/xarray/commit/6402391cf206fd04c12d44773fecd9b42ea0c246 overwrote an array which was needed later on. In general the whole page was a bit messy with the same initial array being forgotten and then re-used much later. This PR attempts to uniformize the variable names in order to make self-consistent subsection examples. 
I hope I didn't brake something on the way, but it looks ok on my local build. (also renamed a gallery example plot which wasn't showing up correctly) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2091/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 308284379,MDExOlB1bGxSZXF1ZXN0MTc3MjQ5OTcx,2015,Fix an overflow bug in decode_cf_datetime,10050469,closed,0,,,4,2018-03-24T17:54:36Z,2018-03-31T01:16:15Z,2018-03-31T01:16:14Z,MEMBER,,0,pydata/xarray/pulls/2015," - [x] Closes https://github.com/pydata/xarray/issues/2002 - [x] Tests added - [x] Tests passed - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API Not sure yet if this is the best way to do this.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2015/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 302679890,MDU6SXNzdWUzMDI2Nzk4OTA=,1966,imshow should work with third dimension of len 1,10050469,closed,0,,,2,2018-03-06T12:22:57Z,2018-03-08T23:51:45Z,2018-03-08T23:51:45Z,MEMBER,,,,"#### Code Sample, a copy-pastable example if possible ```python import xarray as xr import numpy as np da = xr.DataArray(np.arange(9).reshape((1, 3, 3))) da.plot() # works da.plot.imshow() # fails ``` Error log:
``` /home/mowglie/Documents/git/xarray/xarray/plot/utils.py:295: UserWarning: Several dimensions of this array could be colors. Xarray will use the last possible dimension ('dim_2') to match matplotlib.pyplot.imshow. You can pass names of x, y, and/or rgb dimensions to override this guess. 'and/or rgb dimensions to override this guess.' % rgb) --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () ----> 1 da.plot.imshow() ~/Documents/git/xarray/xarray/plot/plot.py in plotmethod(_PlotMethods_obj, x, y, figsize, size, aspect, ax, row, col, col_wrap, xincrease, yincrease, add_colorbar, add_labels, vmin, vmax, cmap, colors, center, robust, extend, levels, infer_intervals, subplot_kws, cbar_ax, cbar_kwargs, **kwargs) 679 for arg in ['_PlotMethods_obj', 'newplotfunc', 'kwargs']: 680 del allargs[arg] --> 681 return newplotfunc(**allargs) 682 683 # Add to class _PlotMethods ~/Documents/git/xarray/xarray/plot/plot.py in newplotfunc(darray, x, y, figsize, size, aspect, ax, row, col, col_wrap, xincrease, yincrease, add_colorbar, add_labels, vmin, vmax, cmap, center, robust, extend, levels, infer_intervals, colors, subplot_kws, cbar_ax, cbar_kwargs, **kwargs) 553 rgb = kwargs.pop('rgb', None) 554 xlab, ylab = _infer_xy_labels( --> 555 darray=darray, x=x, y=y, imshow=imshow_rgb, rgb=rgb) 556 557 if rgb is not None and plotfunc.__name__ != 'imshow': ~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels(darray, x, y, imshow, rgb) 308 assert x is None or x != y 309 if imshow and darray.ndim == 3: --> 310 return _infer_xy_labels_3d(darray, x, y, rgb) 311 312 if x is None and y is None: ~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels_3d(darray, x, y, rgb) 297 298 # Finally, we pick out the red slice and delegate to the 2D version: --> 299 return _infer_xy_labels(darray.isel(**{rgb: 0}).squeeze(), x, y) 300 301 ~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels(darray, x, y, imshow, rgb) 312 if x is None and y is None: 313 if darray.ndim != 2: --> 314 raise ValueError('DataArray must be 2d') 315 y, x = darray.dims 316 elif x is None: ValueError: DataArray must be 2d ```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 302012794,MDExOlB1bGxSZXF1ZXN0MTcyNjc1Nzgy,1958,Update some packages on RTD,10050469,closed,0,,,0,2018-03-03T16:41:00Z,2018-03-03T16:41:10Z,2018-03-03T16:41:10Z,MEMBER,,0,pydata/xarray/pulls/1958,See https://github.com/pydata/xarray/pull/1957,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1958/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 296124381,MDExOlB1bGxSZXF1ZXN0MTY4NDE1NjA1,1902,_color_palette consistent with or without seaborn,10050469,closed,0,,,1,2018-02-10T19:00:59Z,2018-02-16T21:08:32Z,2018-02-16T21:08:32Z,MEMBER,,0,pydata/xarray/pulls/1902," - [x] Closes https://github.com/pydata/xarray/issues/1896 - [ ] Tests added - [ ] Tests passed - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API Instead of relying on seaborn per default we now use mpl and fall back to seaborn if cmap is not recognized (e.g. with `cmap=""husl""`).","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1902/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 293858326,MDU6SXNzdWUyOTM4NTgzMjY=,1880,Should imshow() recognise 0-255 images?,10050469,closed,0,,,1,2018-02-02T11:30:21Z,2018-02-12T22:12:13Z,2018-02-12T22:12:13Z,MEMBER,,,,"#### Code Sample, a copy-pastable example if possible ```python import os import urllib.request import xarray as xr import matplotlib.pyplot as plt # Download the file from rasterio's repository url = 'https://github.com/mapbox/rasterio/raw/master/tests/data/RGB.byte.tif' urllib.request.urlretrieve(url, 'RGB.byte.tif') # Read the data da = xr.open_rasterio('RGB.byte.tif') f, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4)) da.plot.imshow(ax=ax1) (da / 255).plot.imshow(ax=ax2) plt.tight_layout() plt.show() # Delete the file os.remove('RGB.byte.tif') ``` ![figure_1](https://user-images.githubusercontent.com/10050469/35730809-c157c7ba-0813-11e8-8909-24b6d61e685c.png) #### Problem description In https://github.com/pydata/xarray/pull/1796, @Zac-HD added support for RGBA images. If an alpha channel is not found, it is added ([code](https://github.com/pydata/xarray/blob/master/xarray/plot/plot.py#L708)) The problem is that adding this alpha channel requires the images to be normalized to 0-1, while plotting an image in 0-255 range without alpha channel works fine in matplotlib. Removing https://github.com/pydata/xarray/blob/master/xarray/plot/plot.py#L708-L715 would solve the problem, but I guess it was added for a reason. @Zac-HD , thoughts? 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 296303590,MDU6SXNzdWUyOTYzMDM1OTA=,1903,Broken distributed test,10050469,closed,0,,,2,2018-02-12T09:04:51Z,2018-02-12T21:08:05Z,2018-02-12T21:08:05Z,MEMBER,,,,"The recent distributed update (1.20.2) broke a test: https://github.com/pydata/xarray/blob/master/xarray/tests/test_distributed.py#L57-L84 It fails with: ``` > assert s.task_state E AttributeError: 'Scheduler' object has no attribute 'task_state' ```
``` __________________________________ test_async __________________________________ def test_func(): # Restore default logging levels # XXX use pytest hooks/fixtures instead? for name, level in logging_levels.items(): logging.getLogger(name).setLevel(level) old_globals = _globals.copy() result = None workers = [] with pristine_loop() as loop: with check_active_rpc(loop, active_rpc_timeout): @gen.coroutine def coro(): for i in range(5): try: s, ws = yield start_cluster( ncores, scheduler, loop, security=security, Worker=Worker, scheduler_kwargs=scheduler_kwargs, worker_kwargs=worker_kwargs) except Exception: logger.error(""Failed to start gen_cluster, retryng"") else: break workers[:] = ws args = [s] + workers if client: c = yield Client(s.address, loop=loop, security=security, asynchronous=True) args = [c] + args try: result = yield func(*args) if s.validate: s.validate_state() finally: if client: yield c._close() yield end_cluster(s, workers) _globals.clear() _globals.update(old_globals) raise gen.Return(result) > result = loop.run_sync(coro, timeout=timeout) ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/distributed/utils_test.py:749: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/ioloop.py:458: in run_sync return future_cell[0].result() ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/concurrent.py:238: in result raise_exc_info(self._exc_info) :4: in raise_exc_info ??? ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1069: in run yielded = self.gen.send(value) ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/distributed/utils_test.py:737: in coro result = yield func(*args) ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1055: in run value = future.result() ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/concurrent.py:238: in result raise_exc_info(self._exc_info) :4: in raise_exc_info ??? ../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1069: in run yielded = self.gen.send(value) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ c = s = a = b = @pytest.mark.skipif(distributed.__version__ <= '1.19.3', reason='Need recent distributed version to clean up get') @gen_cluster(client=True, timeout=None) def test_async(c, s, a, b): x = create_test_data() assert not dask.is_dask_collection(x) y = x.chunk({'dim2': 4}) + 10 assert dask.is_dask_collection(y) assert dask.is_dask_collection(y.var1) assert dask.is_dask_collection(y.var2) z = y.persist() assert str(z) assert dask.is_dask_collection(z) assert dask.is_dask_collection(z.var1) assert dask.is_dask_collection(z.var2) assert len(y.__dask_graph__()) > len(z.__dask_graph__()) assert not futures_of(y) assert futures_of(z) future = c.compute(z) w = yield future assert not dask.is_dask_collection(w) assert_allclose(x + 10, w) > assert s.task_state E AttributeError: 'Scheduler' object has no attribute 'task_state' ```
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1903/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 289976092,MDU6SXNzdWUyODk5NzYwOTI=,1843,Refactor/modernize the rasterio backend test suite ,10050469,closed,0,10050469,,2,2018-01-19T13:30:02Z,2018-02-07T08:40:34Z,2018-02-07T08:40:34Z,MEMBER,,,,Once https://github.com/pydata/xarray/pull/1817 and https://github.com/pydata/xarray/pull/1712 are merged it might be a good idea to revisit the tests to remove boilerplate code and try to generalize them. ,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1843/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 294777470,MDExOlB1bGxSZXF1ZXN0MTY3NDI3NjA5,1890,Simplify some rasterio tests,10050469,closed,0,,,0,2018-02-06T14:24:34Z,2018-02-07T08:40:34Z,2018-02-07T08:40:34Z,MEMBER,,0,pydata/xarray/pulls/1890," - [x] Closes https://github.com/pydata/xarray/issues/1843 - [x] Tests added - [x] Tests passed - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API This PR restores the tests that were incorrectly removed in https://github.com/pydata/xarray/pull/1817 and adds the what's new entry I forgot in https://github.com/pydata/xarray/pull/1712 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1890/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 294779019,MDExOlB1bGxSZXF1ZXN0MTY3NDI4Nzgx,1891,Use pip install -e in contributing docs,10050469,closed,0,,,0,2018-02-06T14:28:52Z,2018-02-06T19:59:04Z,2018-02-06T19:59:04Z,MEMBER,,0,pydata/xarray/pulls/1891,"I think that pip install is the recommended way to install packages in dev mode (see also http://www.python3statement.org/practicalities/ ) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1891/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 293869118,MDExOlB1bGxSZXF1ZXN0MTY2Nzg3ODg4,1881,Fix rasterio example in docs,10050469,closed,0,,,3,2018-02-02T12:13:33Z,2018-02-02T19:11:01Z,2018-02-02T19:10:54Z,MEMBER,,0,pydata/xarray/pulls/1881,"https://github.com/pydata/xarray/pull/1796 introduced a bug in the doc gallery. This PR reverts the code to the previous greyscale example and adds a new case using imshow (the use case is different, as I tried to explain in the descriptions). I also took care of https://github.com/pydata/xarray/issues/1789#issuecomment-356068358 : the docs should now build even when rasterio is not installed. 
cc @Zac-HD , @shoyer ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1881/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 273268690,MDExOlB1bGxSZXF1ZXN0MTUyMTI1Njky,1712,Use rasterio's transform instead of homemade coordinates ,10050469,closed,0,,,10,2017-11-12T21:52:04Z,2018-01-26T13:51:15Z,2018-01-26T13:50:54Z,MEMBER,,0,pydata/xarray/pulls/1712," - [x] Closes https://github.com/pydata/xarray/issues/1686 - [x] Tests added / passed - [x] Passes ``git diff upstream/master **/*py | flake8 --diff`` - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 285706946,MDExOlB1bGxSZXF1ZXN0MTYwOTI1NDE0,1808,Add gallery example for multiple lines plot,10050469,closed,0,,,1,2018-01-03T14:45:48Z,2018-01-03T18:34:40Z,2018-01-03T18:34:40Z,MEMBER,,0,pydata/xarray/pulls/1808,"Extends https://github.com/pydata/xarray/pull/1804 by adding some usage examples to the gallery. Here is how the plot looks like: ![sphx_glr_plot_lines_from_2d_001](https://user-images.githubusercontent.com/10050469/34524787-12d5eb82-f09d-11e7-9bb0-62070a62f6e6.png) cc @dcherian and @shoyer ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1808/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 186751743,MDU6SXNzdWUxODY3NTE3NDM=,1073,Dataset.concat() doesn't preserve coordinates-variables order,10050469,closed,0,,,1,2016-11-02T09:37:01Z,2017-11-14T21:02:32Z,2017-11-14T21:02:31Z,MEMBER,,,,"Follow-up to https://github.com/pydata/xarray/pull/1049 Example: ```python import xarray as xr import numpy as np ds = xr.Dataset() for vn in ['a', 'b', 'c']: ds[vn] = xr.DataArray(np.arange(10), dims=['t']) dsg = ds.groupby('t').mean() print(list(ds.variables.keys())) out : ['t', 'a', 'b', 'c'] print(list(dsg.variables.keys())) out: ['a', 'b', 'c', 't'] ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 181881219,MDU6SXNzdWUxODE4ODEyMTk=,1042,Dataset.groupby() doesn't preserve variables order,10050469,closed,0,,,8,2016-10-09T11:09:11Z,2017-11-14T20:24:50Z,2016-11-02T09:34:46Z,MEMBER,,,,"Is it intentional? I think it is rather undesirable, but maybe there is some reason for this. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 271599372,MDU6SXNzdWUyNzE1OTkzNzI=,1694,Regression: dropna() on lazy variable,10050469,closed,0,,2415632,10,2017-11-06T19:53:18Z,2017-11-08T13:49:01Z,2017-11-08T13:36:09Z,MEMBER,,,,"#### Code Sample, a copy-pastable example if possible ```python import numpy as np import xarray as xr a = np.random.randn(4, 3) a[1, 1] = np.NaN da = xr.DataArray(a, dims=('y', 'x'), coords={'y':np.arange(4), 'x':np.arange(3)}) da.to_netcdf('test.nc') with xr.open_dataarray('test.nc') as da: da.dropna(dim='x', how='any') --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () 8 9 with xr.open_dataarray('test.nc') as da: ---> 10 da.dropna(dim='x', how='any') ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in dropna(self, dim, how, thresh) 1158 DataArray 1159 """""" -> 1160 ds = self._to_temp_dataset().dropna(dim, how=how, thresh=thresh) 1161 return self._from_temp_dataset(ds) 1162 ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in dropna(self, dim, how, thresh, subset) 2292 raise TypeError('must specify how or thresh') 2293 -> 2294 return self.isel(**{dim: mask}) 2295 2296 def fillna(self, value): ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, **indexers) 1291 coord_names = set(variables).intersection(self._coord_names) 1292 selected = self._replace_vars_and_dims(variables, -> 1293 coord_names=coord_names) 1294 1295 # Extract coordinates from indexers ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in _replace_vars_and_dims(self, variables, coord_names, dims, attrs, inplace) 598 """""" 599 if dims is None: --> 600 dims = calculate_dimensions(variables) 601 if inplace: 602 self._dims = dims ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in calculate_dimensions(variables) 111 raise ValueError('conflicting sizes for dimension %r: ' 112 'length %s on %r and length %s on %r' % --> 113 (dim, size, k, dims[dim], last_used[dim])) 114 return dims 115 ValueError: conflicting sizes for dimension 'y': length 2 on and length 4 on 'y' ``` #### Problem description See above. Note that the code runs when: - data is previously read into memory with `load()` - the `DataArray` is stored without coordinates (this is strange) - `dropna` is applied to `'y'` instead of `'x'` #### Expected Output This used to work in v0.9.6 #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.5.2.final.0 python-bits: 64 OS: Linux OS-release: 4.10.0-38-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 xarray: 0.10.0rc1-5-g2a1d392 pandas: 0.21.0 numpy: 1.13.3 scipy: 1.0.0 netCDF4: 1.3.0 h5netcdf: None Nio: None bottleneck: 1.2.1 cyordereddict: None dask: 0.15.4 matplotlib: 2.1.0 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 36.6.0 pip: 9.0.1 conda: None pytest: 3.2.3 IPython: 6.2.1 sphinx: 1.6.5
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1694/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 271036342,MDU6SXNzdWUyNzEwMzYzNDI=,1688,NotImplementedError: Vectorized indexing for is not implemented.,10050469,closed,0,,2415632,1,2017-11-03T16:21:26Z,2017-11-07T20:41:44Z,2017-11-07T20:41:44Z,MEMBER,,,,"I think this is a regression in the current 0.10.0rc1: #### Code Sample ```python import xarray as xr ds = xr.open_dataset('cesm_data.nc', decode_cf=False) ds.temp.isel(time=ds.time < 274383) # throws an error --------------------------------------------------------------------------- NotImplementedError Traceback (most recent call last) in () ----> 1 ds.temp.isel(time=ds.time < 274383) ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in isel(self, drop, **indexers) 717 DataArray.sel 718 """""" --> 719 ds = self._to_temp_dataset().isel(drop=drop, **indexers) 720 return self._from_temp_dataset(ds) 721 ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, **indexers) 1278 for name, var in iteritems(self._variables): 1279 var_indexers = {k: v for k, v in indexers_list if k in var.dims} -> 1280 new_var = var.isel(**var_indexers) 1281 if not (drop and name in var_indexers): 1282 variables[name] = new_var ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/variable.py in isel(self, **indexers) 771 if dim in indexers: 772 key[i] = indexers[dim] --> 773 return self[tuple(key)] 774 775 def squeeze(self, dim=None): ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/variable.py in __getitem__(self, key) 595 """""" 596 dims, index_tuple, new_order = self._broadcast_indexes(key) --> 597 data = self._indexable_data[index_tuple] 598 if new_order: 599 data = np.moveaxis(data, range(len(new_order)), new_order) ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in __getitem__(self, key) 414 415 def __getitem__(self, key): --> 416 return type(self)(_wrap_numpy_scalars(self.array[key])) 417 418 def __setitem__(self, key, value): ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in __getitem__(self, key) 394 395 def __getitem__(self, key): --> 396 return type(self)(_wrap_numpy_scalars(self.array[key])) 397 398 def __setitem__(self, key, value): ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in __getitem__(self, key) 361 362 def __getitem__(self, key): --> 363 return type(self)(self.array, self._updated_key(key)) 364 365 def __setitem__(self, key, value): ~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in _updated_key(self, new_key) 336 raise NotImplementedError( 337 'Vectorized indexing for {} is not implemented. Load your ' --> 338 'data first with .load() or .compute().'.format(type(self))) 339 new_key = iter(expanded_indexer(new_key, self.ndim)) 340 key = [] NotImplementedError: Vectorized indexing for is not implemented. Load your data first with .load() or .compute(). ``` Here is the file: [cesm_data.nc.zip](https://github.com/pydata/xarray/files/1441729/cesm_data.nc.zip) #### Expected Output This used to work in v0.9 #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.5.2.final.0 python-bits: 64 OS: Linux OS-release: 4.10.0-38-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 xarray: 0.10.0rc1 pandas: 0.21.0 numpy: 1.13.3 scipy: 1.0.0 netCDF4: 1.3.0 h5netcdf: None Nio: None bottleneck: 1.2.1 cyordereddict: None dask: 0.15.4 matplotlib: 2.1.0 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 36.6.0 pip: 9.0.1 conda: None pytest: 3.2.3 IPython: 6.2.1 sphinx: 1.6.5
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 231811609,MDU6SXNzdWUyMzE4MTE2MDk=,1429,Orthogonal indexing and MemoryCachedArray,10050469,closed,0,,,5,2017-05-27T16:20:18Z,2017-11-06T17:21:56Z,2017-11-06T17:21:56Z,MEMBER,,,,"While working on https://github.com/pydata/xarray/pull/1260 I came upon this which looks like a bug in caching: ```python import numpy as np import xarray as xr from xarray.core import indexing nx, ny = 8, 10 data = np.arange(nx*ny).reshape(ny, nx) cached = indexing.MemoryCachedArray(data) data = xr.DataArray(data=data, dims=('y', 'x')) cached = xr.DataArray(data=cached, dims=('y', 'x')) a = data.isel(x=[2, 4], y=[3, 5]) b = cached.isel(x=[2, 4], y=[3, 5]) ``` The last line raises: ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) in () 11 12 a = data.isel(x=[2, 4], y=[3, 5]) ---> 13 b = cached.isel(x=[2, 4], y=[3, 5]) /home/mowglie/Documents/git/xarray/xarray/core/dataarray.py in isel(self, drop, **indexers) 668 DataArray.sel 669 """""" --> 670 ds = self._to_temp_dataset().isel(drop=drop, **indexers) 671 return self._from_temp_dataset(ds) 672 /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in isel(self, drop, **indexers) 1141 for name, var in iteritems(self._variables): 1142 var_indexers = dict((k, v) for k, v in indexers if k in var.dims) -> 1143 new_var = var.isel(**var_indexers) 1144 if not (drop and name in var_indexers): 1145 variables[name] = new_var /home/mowglie/Documents/git/xarray/xarray/core/variable.py in isel(self, **indexers) 547 if dim in indexers: 548 key[i] = indexers[dim] --> 549 return self[tuple(key)] 550 551 def squeeze(self, dim=None): /home/mowglie/Documents/git/xarray/xarray/core/variable.py in __getitem__(self, key) 380 # orthogonal indexing should ensure the dimensionality is consistent 381 if hasattr(values, 'ndim'): --> 382 assert values.ndim == len(dims), (values.ndim, len(dims)) 383 else: 384 assert len(dims) == 0, len(dims) AssertionError: (1, 2) ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 201428093,MDU6SXNzdWUyMDE0MjgwOTM=,1215,to_netcdf() fails to append to an existing file,10050469,closed,0,,,14,2017-01-17T22:45:45Z,2017-10-25T05:09:10Z,2017-10-25T05:09:10Z,MEMBER,,,,"The following code used to work well in v0.8.2: ```python import os import xarray as xr path = 'test.nc' if os.path.exists(path): os.remove(path) ds = xr.Dataset() ds['dim'] = ('dim', [0, 1, 2]) ds['var1'] = ('dim', [10, 11, 12]) ds.to_netcdf(path) ds = xr.Dataset() ds['dim'] = ('dim', [0, 1, 2]) ds['var2'] = ('dim', [10, 11, 12]) ds.to_netcdf(path, 'a') ``` On master, it fails with: ``` --------------------------------------------------------------------------- RuntimeError Traceback (most recent call last) in () 14 ds['dim'] = ('dim', [0, 1, 2]) 15 ds['var2'] = ('dim', [10, 11, 12]) ---> 16 ds.to_netcdf(path, 'a') /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding) 927 from ..backends.api import to_netcdf 928 return to_netcdf(self, path, mode, format=format, group=group, --> 929 engine=engine, encoding=encoding) 930 931 def 
__unicode__(self): /home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path, mode, format, group, engine, writer, encoding) 563 store = store_cls(path, mode, format, group, writer) 564 try: --> 565 dataset.dump_to_store(store, sync=sync, encoding=encoding) 566 if isinstance(path, BytesIO): 567 return path.getvalue() /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding) 873 variables, attrs = encoder(variables, attrs) 874 --> 875 store.store(variables, attrs, check_encoding) 876 if sync: 877 store.sync() /home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set) 219 cf_variables, cf_attrs = cf_encoder(variables, attributes) 220 AbstractWritableDataStore.store(self, cf_variables, cf_attrs, --> 221 check_encoding_set) 222 223 /home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set) 194 def store(self, variables, attributes, check_encoding_set=frozenset()): 195 self.set_attributes(attributes) --> 196 self.set_variables(variables, check_encoding_set) 197 198 def set_attributes(self, attributes): /home/mowglie/Documents/git/xarray/xarray/backends/common.py in set_variables(self, variables, check_encoding_set) 204 name = _encode_variable_name(vn) 205 check = vn in check_encoding_set --> 206 target, source = self.prepare_variable(name, v, check) 207 self.writer.add(source, target) 208 /home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in prepare_variable(self, name, variable, check_encoding) 293 endian='native', 294 least_significant_digit=encoding.get('least_significant_digit'), --> 295 fill_value=fill_value) 296 nc4_var.set_auto_maskandscale(False) 297 netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.createVariable (netCDF4/_netCDF4.c:18740)() netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__init__ (netCDF4/_netCDF4.c:30713)() RuntimeError: NetCDF: String match to name in use ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1215/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 264209937,MDU6SXNzdWUyNjQyMDk5Mzc=,1620,repr of class methods,10050469,closed,0,,,5,2017-10-10T12:32:10Z,2017-10-12T09:02:55Z,2017-10-12T09:02:55Z,MEMBER,,,,"Some live news from the classroom. A student (who is learning python and xarray at the same time) wanted to compute the minimum of an array and forgot the parenthesis (quite a common mistake). The printout in the notebook in that case is: ```python >>> ds.toa_sw_all_mon.min .wrapped_func of [777600 values with dtype=float32] Coordinates: * month (month) float32 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 * lat (lat) float32 -89.5 -88.5 -87.5 -86.5 -85.5 -84.5 -83.5 -82.5 ... * lon (lon) float32 -179.5 -178.5 -177.5 -176.5 -175.5 -174.5 -173.5 ... Attributes: long_name: Top of The Atmosphere Shortwave Flux, Monthly Means, All-... standard_name: TOA Shortwave Flux - All-Sky CF_name: toa_outgoing_shortwave_flux IPCC_name: none units: W m-2 valid_min: 0.00000 valid_max: 600.000> ``` which, I had to agree, is hard to identify as being a function. 
It's a detail, but is it necessary to print the entire repr of the array/dataset in the function repr?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 158902764,MDExOlB1bGxSZXF1ZXN0NzI4NjAzMDI=,872,ENH: more control on colorbar,10050469,closed,0,,,10,2016-06-07T11:44:52Z,2017-08-10T15:49:55Z,2016-06-09T15:52:46Z,MEMBER,,0,pydata/xarray/pulls/872,"Addresses https://github.com/pydata/xarray/issues/752 and allows to pass kwargs to colorbar. For example, it is now possible to do: ``` python import numpy as np import matplotlib.pyplot as plt import xarray as xr x, y = np.meshgrid(np.arange(12), np.arange(12)) z = xr.DataArray(np.sqrt(x**2 + y**2)) ds = z.to_dataset(name='z') fig, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize=(12, 12)) ds.z.plot.contourf(ax=ax1) ds.z.plot.contourf(ax=ax2, cbar_kwargs={'orientation':'horizontal', 'label':'MyLabel'}) ds.z.plot.contourf(ax=ax3, cbar_ax=ax4, cbar_kwargs={'orientation':'horizontal', 'label':'Funny Cbar', 'drawedges':True}) plt.tight_layout() plt.show() ``` ![test_cbar](https://cloud.githubusercontent.com/assets/10050469/15856312/f6eb5a04-2cb4-11e6-81b7-a5420b5c8653.png) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 248247382,MDExOlB1bGxSZXF1ZXN0MTM0MzI2NTA4,1502,Fix rasterio builds in docs,10050469,closed,0,,,1,2017-08-06T13:30:39Z,2017-08-07T09:21:39Z,2017-08-07T09:21:39Z,MEMBER,,0,pydata/xarray/pulls/1502," - [ ] Closes #xxxx - [ ] Tests added / passed - [ ] Passes ``git diff upstream/master | flake8 --diff`` - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API The pinned version of rasterio we use for the doc builds isn't available anymore: https://github.com/conda-forge/rasterio-feedstock/pull/36 Tests: - builds: https://readthedocs.org/projects/xray/builds/5796746/ - displays correctly: http://xarray.pydata.org/en/fix-docs/auto_gallery/plot_rasterio.html#sphx-glr-auto-gallery-plot-rasterio-py ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 234242845,MDExOlB1bGxSZXF1ZXN0MTI0NDQ0ODUx,1445,DOC: add rasterio to build environment,10050469,closed,0,,,0,2017-06-07T15:08:05Z,2017-06-07T15:11:08Z,2017-06-07T15:11:07Z,MEMBER,,0,pydata/xarray/pulls/1445,This is needed for the doc recipe to run,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 233849342,MDExOlB1bGxSZXF1ZXN0MTI0MTU5MzM0,1443,DOC: add salem to the list of projects extending xarray,10050469,closed,0,,,0,2017-06-06T10:37:50Z,2017-06-07T11:59:52Z,2017-06-07T11:57:35Z,MEMBER,,0,pydata/xarray/pulls/1443,[Salem](https://github.com/fmaussion/salem) is likely to remain a small project but it has a more-than-one user base and might provide some inspiration for the future [pangeo](https://pangeo-data.github.io) projects.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1443/reactions"", ""total_count"": 1, ""+1"": 1, 
""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 206905158,MDExOlB1bGxSZXF1ZXN0MTA1NzA0NzU3,1260,Add RasterIO backend,10050469,closed,0,,,38,2017-02-10T20:59:31Z,2017-06-06T16:44:43Z,2017-06-06T10:25:22Z,MEMBER,,0,pydata/xarray/pulls/1260,"Follow-up to https://github.com/pydata/xarray/pull/1070 This is my first backend so this is going to be a bit more work than I expected, but with your help we should be able to get through this. A long todo list: - [x] closes https://github.com/pydata/xarray/issues/790 - [x] add tests - [x] passes ``git diff upstream/master | flake8 --diff`` - [x] whatsnew entry - [x] wrap __getitem__ for lazy data read - [x] ~~make coordinates variables lazy too~~ ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 232222633,MDExOlB1bGxSZXF1ZXN0MTIzMDI3OTEw,1432,Add dask specific kwargs to DataArray.chunk(),10050469,closed,0,,,0,2017-05-30T11:30:27Z,2017-05-30T18:04:02Z,2017-05-30T16:50:02Z,MEMBER,,0,pydata/xarray/pulls/1432," - [ ] Closes #xxxx - [ ] Tests added / passed - [x] Passes ``git diff upstream/master | flake8 --diff`` - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API this is needed for the Rasterio PR (https://github.com/pydata/xarray/pull/1260#discussion_r118824181) There was no test for these functionalities in the xarray test suite for ``Dataset`` either, so I just added a dummy test for one of the keywords. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 230219330,MDExOlB1bGxSZXF1ZXN0MTIxNjU5OTY0,1419,Add sphinx-gallery to the docs,10050469,closed,0,,,6,2017-05-21T13:00:44Z,2017-05-24T15:20:41Z,2017-05-24T15:20:41Z,MEMBER,,0,pydata/xarray/pulls/1419," - [x] Closes https://github.com/pydata/xarray/issues/1061 - [ ] Tests added / passed - [ ] Passes ``git diff upstream/master | flake8 --diff`` - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API This uses [sphinx-gallery](http://sphinx-gallery.readthedocs.io) to illustrate some examples of the xarray workflow. See it rendered [here](http://xarray.pydata.org/en/fix-docs/auto_gallery/index.html) It's nicer for the gallery if there is a plot to draw in the end but it doesn't *have* to be a plot. Once the gallery gets more examples it's possible to sort them by topic (see the sphinx gallery documentation for an overview of the possibilities). I could think of several other things we could add (including moving or copying @jhamman and @rabernat 's examples to the gallery), but I thought it's better to merge quickly in order to encourage the community to contribute with more examples before the next release. 
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1419/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 185181822,MDExOlB1bGxSZXF1ZXN0OTA4NTM1ODU=,1060,Remove obsolete NetCDF4 Error catch,10050469,closed,0,,,7,2016-10-25T17:25:33Z,2017-05-22T08:58:21Z,2017-05-22T08:58:21Z,MEMBER,,0,pydata/xarray/pulls/1060,"[edit: this original PR was replaced with another change: ""remove obsolete NetCDF4 Error catch""] This allows to give a customized NetCDF4 object to the DataStore. I needed this for Salem's diagnostic variables: https://github.com/fmaussion/salem/blob/master/salem/sio.py#L737-L751 My workaround (a very [shallow](https://github.com/fmaussion/salem/blob/master/salem/sio.py#L653) subclass of NetCDF4DataStore) is fine for me, so I won't be offended if you decide not to merge. On a related issue, the IndexError catch which happens [here](https://github.com/pydata/xarray/blob/master/xarray/backends/netCDF4_.py#L52) is also giving me trouble, since it hides bugs in my code. I know I'm kind of misusing the backend, but I find it very useful the way it is. I'm open to any suggestion to make it more elegant. Thanks! ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 230211583,MDExOlB1bGxSZXF1ZXN0MTIxNjU1NzE3,1417,Update weather data example,10050469,closed,0,,,1,2017-05-21T10:31:06Z,2017-05-21T19:03:04Z,2017-05-21T10:33:08Z,MEMBER,,0,pydata/xarray/pulls/1417," - [ ] Closes #xxxx - [ ] Tests added / passed - [ ] Passes ``git diff upstream/master | flake8 --diff`` - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API For some reason a plot from the examples was rendered statically instead of live. This is now fixed, together with an update of pandas which prevents reshaping operations on indexes. 
Tested [here](http://xarray.pydata.org/en/fix-docs/examples/weather-data.html)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 228821088,MDU6SXNzdWUyMjg4MjEwODg=,1409,Importing h5py corrupts xarray's IO,10050469,closed,0,,,3,2017-05-15T19:37:13Z,2017-05-21T09:31:49Z,2017-05-21T09:31:49Z,MEMBER,,,,"Not sure if this is an xarray issue, a netCDF4 or a h5py one, but I found that importing h5py is not a good idea if you want to write to netcdf4 afterwards: ```python import h5py import xarray as xr ds = xr.Dataset({'x': [1, 2, 3]}) ds.to_netcdf('test.nc4') ``` Errors with: ``` --------------------------------------------------------------------------- RuntimeError Traceback (most recent call last) in () 2 import xarray as xr 3 ds = xr.Dataset({'x': [1, 2, 3]}) ----> 4 ds.to_netcdf('test.nc4') /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims) 974 return to_netcdf(self, path, mode, format=format, group=group, 975 engine=engine, encoding=encoding, --> 976 unlimited_dims=unlimited_dims) 977 978 def __unicode__(self): /home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims) 581 try: 582 dataset.dump_to_store(store, sync=sync, encoding=encoding, --> 583 unlimited_dims=unlimited_dims) 584 if path_or_file is None: 585 return target.getvalue() /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims) 913 914 store.store(variables, attrs, check_encoding, --> 915 unlimited_dims=unlimited_dims) 916 if sync: 917 store.sync() /home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, *args, **kwargs) 244 cf_variables, cf_attrs = cf_encoder(variables, attributes) 245 AbstractWritableDataStore.store(self, cf_variables, cf_attrs, --> 246 *args, **kwargs) 247 248 /home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, unlimited_dims) 213 self.set_attributes(attributes) 214 self.set_variables(variables, check_encoding_set, --> 215 unlimited_dims=unlimited_dims) 216 217 def set_attributes(self, attributes): /home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in set_variables(self, *args, **kwargs) 286 def set_variables(self, *args, **kwargs): 287 with self.ensure_open(autoclose=False): --> 288 super(NetCDF4DataStore, self).set_variables(*args, **kwargs) 289 290 def prepare_variable(self, name, variable, check_encoding=False, /usr/lib/python3.5/contextlib.py in __exit__(self, type, value, traceback) 75 value = type() 76 try: ---> 77 self.gen.throw(type, value, traceback) 78 raise RuntimeError(""generator didn't stop after throw()"") 79 except StopIteration as exc: /home/mowglie/Documents/git/xarray/xarray/backends/common.py in ensure_open(self, autoclose) 282 self.close() 283 else: --> 284 yield 285 286 def assert_open(self): /home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in set_variables(self, *args, **kwargs) 286 def set_variables(self, *args, **kwargs): 287 with self.ensure_open(autoclose=False): --> 288 super(NetCDF4DataStore, self).set_variables(*args, **kwargs) 289 290 def prepare_variable(self, name, variable, check_encoding=False, 
/home/mowglie/Documents/git/xarray/xarray/backends/common.py in set_variables(self, variables, check_encoding_set, unlimited_dims) 226 target, source = self.prepare_variable( 227 name, v, check, unlimited_dims=unlimited_dims) --> 228 self.writer.add(source, target) 229 230 def set_necessary_dimensions(self, variable, unlimited_dims=None): /home/mowglie/Documents/git/xarray/xarray/backends/common.py in add(self, source, target) 167 else: 168 try: --> 169 target[...] = source 170 except TypeError: 171 # workaround for GH: scipy/scipy#6880 netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__setitem__ (netCDF4/_netCDF4.c:48315)() netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable._put (netCDF4/_netCDF4.c:49808)() RuntimeError: NetCDF: HDF error ``` Note that using ``engine='scipy'`` or omitting to import ``h5py`` doesn't throw an error of course. For the record: - h5py: 2.7.0 - xarray: 0.9.5-5-gfd6e36e - system: linux ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 227974548,MDExOlB1bGxSZXF1ZXN0MTIwMDk0MDk4,1404,Fix pandas.tslib deprecation warning,10050469,closed,0,,,0,2017-05-11T12:29:13Z,2017-05-11T16:46:23Z,2017-05-11T16:46:23Z,MEMBER,,0,pydata/xarray/pulls/1404," - [ ] closes #xxxx - [ ] tests added / passed - [ ] passes ``git diff upstream/master | flake8 --diff`` - [ ] whatsnew entry Since the last pandas update I get: ``` /home/mowglie/Documents/git/xarray/xarray/core/formatting.py:16: FutureWarning: The pandas.tslib module is deprecated and will be removed in a future version. from pandas.tslib import OutOfBoundsDatetime ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1404/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 227807405,MDExOlB1bGxSZXF1ZXN0MTE5OTc2Mjk4,1402,Adds bottleneck to the test suite,10050469,closed,0,,,1,2017-05-10T20:54:04Z,2017-05-10T23:11:06Z,2017-05-10T23:10:58Z,MEMBER,,0,pydata/xarray/pulls/1402," - [x] closes https://github.com/pydata/xarray/issues/1401 - [ ] tests added / passed - [ ] passes ``git diff upstream/master | flake8 --diff`` - [ ] whatsnew entry I don't know if we need to install all packages to cover all the tests which rely on bottleneck, but this was the easiest. 
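As background on why the package needs to be installed at all, this is the generic pytest pattern (not necessarily xarray's exact helper) that makes bottleneck-dependent tests skip silently when the library is missing, so they only run once it is in the test environment:

```python
import numpy as np
import pytest

bn = pytest.importorskip('bottleneck')  # skips the whole module if bottleneck is absent

def test_move_mean_shape():
    result = bn.move_mean(np.arange(5.0), window=2)
    assert result.shape == (5,)
```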
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1402/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 222143177,MDExOlB1bGxSZXF1ZXN0MTE2MTQxODc0,1377,Remove debug print(),10050469,closed,0,,,1,2017-04-17T14:29:42Z,2017-04-17T17:45:46Z,2017-04-17T17:45:44Z,MEMBER,,0,pydata/xarray/pulls/1377,This was introduced by https://github.com/pydata/xarray/pull/1368 (@shoyer) ,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1377/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 211888882,MDU6SXNzdWUyMTE4ODg4ODI=,1295,Terminology for the various coordinates,10050469,closed,0,,,8,2017-03-04T16:12:53Z,2017-03-15T16:28:12Z,2017-03-15T16:28:12Z,MEMBER,,,,"Picking up a thread about the ``repr`` (https://github.com/pydata/xarray/issues/1199#issuecomment-272824929), I think it would be good to give a name to the two different types of coordinates in xarray. Currently the doc says: > One dimensional coordinates with a name equal to their sole dimension (marked by * when printing a dataset or data array) take on a special meaning in xarray. They are used for label based indexing and alignment, like the index found on a pandas DataFrame or Series. Indeed, these “dimension” coordinates use a pandas.Index internally to store their values. > Other than for indexing, xarray does not make any direct use of the values associated with coordinates. Coordinates with names not matching a dimension are not used for alignment or indexing, nor are they required to match when doing arithmetic (see Coordinates). The use of quotation marks in ``“dimension” coordinates`` makes the term imprecise. Should we simply call the former ``dimension coordinates`` and the latter ``optional coordinates``? This would also help to uniformize error reporting (e.g. https://github.com/pydata/xarray/pull/1291#discussion_r104261803)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 211943192,MDExOlB1bGxSZXF1ZXN0MTA5MTQyODc1,1296,Clearer terminology for coordinate variables,10050469,closed,0,,,5,2017-03-05T10:08:41Z,2017-03-15T16:28:12Z,2017-03-15T16:28:12Z,MEMBER,,0,pydata/xarray/pulls/1296," - [x] closes https://github.com/pydata/xarray/issues/1295 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1296/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 209082646,MDU6SXNzdWUyMDkwODI2NDY=,1280,Current doc builds are broken,10050469,closed,0,10050469,,2,2017-02-21T09:12:52Z,2017-03-08T10:42:21Z,2017-03-08T10:42:21Z,MEMBER,,,,"This is a RTD problem so out of our control, but I'll leave this issue opened until it is resolved. 
See RTD issue: https://github.com/rtfd/readthedocs.org/issues/2651","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 200125945,MDU6SXNzdWUyMDAxMjU5NDU=,1199,Document the new __repr__,10050469,closed,0,,2244472,23,2017-01-11T15:37:37Z,2017-01-30T17:41:34Z,2017-01-30T17:41:34Z,MEMBER,,,,"Sorry I missed that one when it was decided upon in https://github.com/pydata/xarray/pull/1017, but I think the changes in ``repr`` should be documented somewhere (at the minimum in the ""Breaking Changes"" section of what's new). I just updated Salem for it to work well with xarray 0.9.0. The changes I had to make where quite small (that's a good thing), but it took me a bit of time to understand what was going on. What I found confusing is following: ```python In [1]: import xarray as xr In [2]: ds = xr.DataArray([1, 2, 3]).to_dataset(name='var') In [3]: ds Out[3]: Dimensions: (dim_0: 3) Coordinates: o dim_0 (dim_0) - Data variables: var (dim_0) int64 1 2 3 In [4]: 'dim_0' in ds.coords Out[4]: False ``` ``dim_0``is listed as coordinate, but ``'dim_0' in ds.coords`` is ``False``. I think it should remain like this, but maybe we should document somewhere what the ""o"" and ""*"" mean? (possibly [here](http://xarray.pydata.org/en/latest/data-structures.html#creating-a-dataset)) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1199/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 203906493,MDExOlB1bGxSZXF1ZXN0MTAzNjY2MDUy,1235,Use LooseVersion for bottleneck checks,10050469,closed,0,,,0,2017-01-29T23:54:16Z,2017-01-30T00:10:40Z,2017-01-30T00:10:40Z,MEMBER,,0,pydata/xarray/pulls/1235," - [x] closes https://github.com/pydata/xarray/issues/1208 - [ ] tests added / passed - [x] passes ``git diff upstream/master | flake8 --diff`` - [ ] whatsnew entry ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 202966756,MDExOlB1bGxSZXF1ZXN0MTAzMDIzMTEz,1227,mpl v2 for the docs,10050469,closed,0,,,0,2017-01-24T23:04:55Z,2017-01-25T00:00:55Z,2017-01-25T00:00:55Z,MEMBER,,0,pydata/xarray/pulls/1227,"For some reason I can't reach the webpage but @shoyer seems to be able to do so: https://github.com/pydata/xarray/issues/1167#issuecomment-274966778","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 202124581,MDExOlB1bGxSZXF1ZXN0MTAyNDUxMjE3,1222,New testing module and tests refactor,10050469,closed,0,,,8,2017-01-20T12:04:01Z,2017-01-24T21:18:20Z,2017-01-24T21:18:20Z,MEMBER,,0,pydata/xarray/pulls/1222," - [x] closes https://github.com/pydata/xarray/issues/1218 - [x] tests added / passed - [x] whatsnew entry ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 201884828,MDExOlB1bGxSZXF1ZXN0MTAyMjgzMTk1,1220,Fix 
docs,10050469,closed,0,,,5,2017-01-19T15:04:34Z,2017-01-20T17:05:54Z,2017-01-20T08:48:45Z,MEMBER,,0,pydata/xarray/pulls/1220,"This PR fixes a problem we had with ipython on RTD: https://github.com/ipython/ipython/issues/8733 We this simple workaround we can now use the latest ipython version. Thanks to @takluyver for his help!","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 201021909,MDU6SXNzdWUyMDEwMjE5MDk=,1211,__repr__ of 2D coordinates,10050469,closed,0,,,2,2017-01-16T13:41:30Z,2017-01-17T11:37:12Z,2017-01-17T11:37:12Z,MEMBER,,,,"This is a minor issue (sorry to be so picky about the repr ;) ) Small 2D coordinates are represented in a weird way: ```python In [1]: import xarray as xr In [2]: a = np.array([[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]]) In [3]: da = xr.DataArray(a, dims=['y', 'x'], coords={'xy':(['y', 'x'], a)}) In [4]: da Out[4]: array([[ 1.1, 2.2, 3.3], [ 4.4, 5.5, 6.6]]) Coordinates: xy (y, x) float64 1.1 2.2 3.3 4.4 5.5 6.6 o y (y) - o x (x) - ``` This line here: ``` Coordinates: xy (y, x) float64 1.1 2.2 3.3 4.4 5.5 6.6 ``` Is a bit confusing as it flattens the coordinates. It's not a big deal though, just something to keep in mind maybe.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 200701826,MDExOlB1bGxSZXF1ZXN0MTAxNDk2NDUw,1206,RTD: fix facetted maps example,10050469,closed,0,,,1,2017-01-13T18:48:08Z,2017-01-14T09:15:54Z,2017-01-14T09:15:54Z,MEMBER,,0,pydata/xarray/pulls/1206,follow-up to https://github.com/pydata/xarray/pull/1203,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 196278181,MDExOlB1bGxSZXF1ZXN0OTg0NzU5NDU=,1171,Fix tests for upcoming matplotlib v2,10050469,closed,0,,,20,2016-12-18T14:20:19Z,2017-01-04T07:43:04Z,2017-01-04T07:43:04Z,MEMBER,,0,pydata/xarray/pulls/1171,Just to see what happens with the upcoming matplotlib version 2,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 198030158,MDExOlB1bGxSZXF1ZXN0OTk2NzE3Njk=,1191,Integer levels and vmin/vmax,10050469,closed,0,,,2,2016-12-29T16:10:22Z,2017-01-03T09:50:14Z,2017-01-03T09:50:14Z,MEMBER,,0,pydata/xarray/pulls/1191,"Follow-up to https://github.com/pydata/xarray/pull/1171#issuecomment-269556898 From the new docstring: ``` levels : int or list-like object, optional Split the colormap (cmap) into discrete color intervals. If an integer is provided, ""nice"" levels are chosen based on the data range: this can imply that the final number of levels is not exactly the expected one. Setting ``vmin`` and/or ``vmax`` with ``levels=N`` is equivalent to setting ``levels=np.linspace(vmin, vmax, N)``. ``` The logic overhead is quite simple, which is an argument in favor of this simple solution. It is consistent with mpl, as long as neither vmin or vmax are set. It _might_ change the outcome of some existing plots, though. 
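To make the equivalence in the docstring concrete, a small self-contained illustration (synthetic data, not taken from the PR):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.random.RandomState(0).rand(10, 10), dims=('y', 'x'))

# With vmin/vmax given, an integer ``levels`` is expanded explicitly, so these
# two calls request the same color intervals:
da.plot.contourf(levels=5, vmin=0.2, vmax=0.8)
da.plot.contourf(levels=np.linspace(0.2, 0.8, 5))
```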
Should I mention this in the ""Breaking changes"" section too?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188985482,MDU6SXNzdWUxODg5ODU0ODI=,1114,Converting rasm file to netCDF3 using xarray,10050469,closed,0,,,6,2016-11-13T18:25:28Z,2016-12-27T22:24:28Z,2016-12-27T22:24:28Z,MEMBER,,,,"This would help new users like https://github.com/pydata/xarray/issues/1113 and simplify the RTD build process (https://github.com/pydata/xarray/issues/1106). The problem is that it is not as trivial as expected. On the latest master: ```python import xarray as xr ds = xr.tutorial.load_dataset('rasm') ds.to_netcdf('rasm.nc', format='NETCDF3_CLASSIC', engine='scipy') ``` Throws an error: ```python --------------------------------------------------------------------------- ValueError Traceback (most recent call last) /home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path, mode, format, group, engine, writer, encoding) 516 try: --> 517 dataset.dump_to_store(store, sync=sync, encoding=encoding) 518 if isinstance(path, BytesIO): /home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding) 754 if sync: --> 755 store.sync() 756 /home/mowglie/Documents/git/xarray/xarray/backends/scipy_.py in sync(self) 149 super(ScipyDataStore, self).sync() --> 150 self.ds.flush() 151 /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in flush(self) 388 if hasattr(self, 'mode') and self.mode in 'wa': --> 389 self._write() 390 sync = flush /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write(self) 400 self._write_gatt_array() --> 401 self._write_var_array() 402 /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write_var_array(self) 448 for name in variables: --> 449 self._write_var_metadata(name) 450 # Now that we have the metadata, we know the vsize of /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write_var_metadata(self, name) 466 for dimname in var.dimensions: --> 467 dimid = self._dims.index(dimname) 468 self._pack_int(dimid) ValueError: '2' is not in list ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 196163136,MDExOlB1bGxSZXF1ZXN0OTg0MDk1ODA=,1169,DOC: small improvements to the netCDF docs,10050469,closed,0,,,1,2016-12-16T21:58:17Z,2016-12-24T11:54:51Z,2016-12-24T11:54:51Z,MEMBER,,0,pydata/xarray/pulls/1169,"Partly addresses https://github.com/pydata/xarray/issues/768#issuecomment-187226020 and https://github.com/pydata/xarray/issues/1154, mostly by making the tone less defensive. I also added that we recommend the netCDF format for IO. (I think that the format is powerful enough for being useful in other disciplines than geosciences). I agree with @rabernat that the current page title (Serialization and IO) could be changed to something more accessible, but I don't know how. 
Two possibilities, both quite long (it looks a bit ugly on RTD): - ""Reading and writing xarray data structures on disk"" - ""Reading/writing xarray data structures""","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 193226503,MDU6SXNzdWUxOTMyMjY1MDM=,1150,"""ncdump -h"" like repr?",10050469,closed,0,,,4,2016-12-02T21:51:36Z,2016-12-23T17:36:54Z,2016-12-23T17:36:54Z,MEMBER,,,,"Sometimes it could be useful to have a view of all variables attributes at a glance. For example, this is the repr for ERA-Interim energy fluxes data: ``` (...) Data variables: slhf (month, latitude, longitude) float64 0.02852 0.02852 0.02852 ... tsr (month, latitude, longitude) float64 -0.0001912 -0.0001912 ... strd (month, latitude, longitude) float64 166.6 166.6 166.6 166.6 ... strc (month, latitude, longitude) float64 -66.23 -66.23 -66.23 ... tisr (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ... ssrd (month, latitude, longitude) float64 -0.0003951 -0.0003951 ... ssrc (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ... str (month, latitude, longitude) float64 -40.65 -40.65 -40.65 ... ttr (month, latitude, longitude) float64 -171.5 -171.5 -171.5 ... tsrc (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ... sshf (month, latitude, longitude) float64 10.46 10.46 10.46 10.46 ... ssr (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ... ttrc (month, latitude, longitude) float64 -174.9 -174.9 -174.9 ... (...) ``` This is what my students will see when they explore the dataset for the first time. It could be nice to have a utility function (e.g. ``dumph`` or something) which would have a style closer to ``ncdump -h``: ``` (...) double ttr(month, latitude, longitude) ; ttr:least_significant_digit = 2L ; ttr:units = ""J m**-2"" ; ttr:long_name = ""Top net thermal radiation"" ; ttr:standard_name = ""toa_outgoing_longwave_flux"" ; double tsrc(month, latitude, longitude) ; tsrc:least_significant_digit = 2L ; tsrc:units = ""J m**-2"" ; tsrc:long_name = ""Top net solar radiation, clear sky"" ; double sshf(month, latitude, longitude) ; sshf:least_significant_digit = 2L ; sshf:units = ""J m**-2"" ; sshf:long_name = ""Surface sensible heat flux"" ; sshf:standard_name = ""surface_upward_sensible_heat_flux"" ; double ssr(month, latitude, longitude) ; ssr:least_significant_digit = 2L ; ssr:units = ""J m**-2"" ; ssr:long_name = ""Surface net solar radiation"" ; ssr:standard_name = ""surface_net_downward_shortwave_flux"" ; double ttrc(month, latitude, longitude) ; ttrc:least_significant_digit = 2L ; ttrc:units = ""J m**-2"" ; ttrc:long_name = ""Top net thermal radiation, clear sky"" ; (...) 
``` Or is there something like this already?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1150/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 196134363,MDExOlB1bGxSZXF1ZXN0OTgzODgxNzE=,1168,"Add figsize, size, and aspect arguments to plotting methods",10050469,closed,0,,,3,2016-12-16T19:18:13Z,2016-12-18T22:43:19Z,2016-12-18T22:43:19Z,MEMBER,,0,pydata/xarray/pulls/1168,"Extends and finishes https://github.com/pydata/xarray/pull/637 I chose to keep seaborn's convention for two reasons: - it doesn't break existing code - now that ``figsize`` is also available, I expect the ``size`` and ``aspect`` kwargs to be less used in non-facetgrid plots Closes https://github.com/pydata/xarray/issues/897","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 194393404,MDExOlB1bGxSZXF1ZXN0OTcxNjM2MzU=,1160,Norm should be passed to facetgrid too,10050469,closed,0,,,0,2016-12-08T17:15:32Z,2016-12-09T09:46:58Z,2016-12-09T09:46:58Z,MEMBER,,0,pydata/xarray/pulls/1160,fixes https://github.com/pydata/xarray/issues/1159,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1160/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 194322754,MDExOlB1bGxSZXF1ZXN0OTcxMTI0OTE=,1158,RTD: fix docs,10050469,closed,0,,,0,2016-12-08T12:24:09Z,2016-12-08T12:24:27Z,2016-12-08T12:24:27Z,MEMBER,,0,pydata/xarray/pulls/1158,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 194113140,MDExOlB1bGxSZXF1ZXN0OTY5Njk2MjI=,1156,RTD: furter attempt to fix savefig,10050469,closed,0,,,0,2016-12-07T17:26:07Z,2016-12-07T17:30:18Z,2016-12-07T17:30:18Z,MEMBER,,0,pydata/xarray/pulls/1156,This implements a more robust solution based on the location of ``conf.py``...,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 194095134,MDExOlB1bGxSZXF1ZXN0OTY5NTYyOTQ=,1155,RTD: fix savefig dir creation,10050469,closed,0,,,1,2016-12-07T16:21:01Z,2016-12-07T16:53:07Z,2016-12-07T16:38:03Z,MEMBER,,0,pydata/xarray/pulls/1155,"I've had a hard time finding out what was going on with missing RTD plots since we updated the packages until I came across this: https://github.com/ipython/ipython/issues/8733 The proposed fix creates the ``@savefig`` directory if not there. It works locally but I'm not sure if it will work on RTD (they might use a custom build directory). The other changes are minor tweaks to remove some build warnings and/or making use of the latest sphinx Makefile. I also removed a part of the conf.py which I think is not needed anymore. 
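A hypothetical sketch of the kind of workaround described above (the real diff may differ, and the directory name here is an assumption):

```python
# doc/conf.py (sketch): make sure the directory that the ipython directive's
# @savefig option writes to exists before the build starts
import os

savefig_dir = '_build/html/_static'  # assumed location, relative to doc/
if not os.path.exists(savefig_dir):
    os.makedirs(savefig_dir)
```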
@shoyer do you remember what the ``inspect.findsource`` monkeypatch was good for?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 192685832,MDExOlB1bGxSZXF1ZXN0OTU5NzgyMjU=,1144,Update ipython on RTD,10050469,closed,0,,,1,2016-11-30T21:33:55Z,2016-11-30T22:56:24Z,2016-11-30T22:45:41Z,MEMBER,,0,pydata/xarray/pulls/1144,"Another attempt to solve a cryptic error: https://readthedocs.org/projects/xray/builds/4727274/ But this time we are getting closer! I've found a report on ipython (https://github.com/ipython/ipython/issues/8850), an I've been able to reproduce it locally and solve it by updating to ipython 5 ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 192134438,MDExOlB1bGxSZXF1ZXN0OTU1ODkwODk=,1141,Try py3 again on RTD,10050469,closed,0,,,0,2016-11-28T22:27:33Z,2016-11-28T23:38:59Z,2016-11-28T23:38:59Z,MEMBER,,0,pydata/xarray/pulls/1141,"sorry about the many PRs :-( Let's see what happens with this one","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 192083180,MDExOlB1bGxSZXF1ZXN0OTU1NTMyMDk=,1139,Unpin python on RTD,10050469,closed,0,,,2,2016-11-28T18:35:51Z,2016-11-28T22:13:58Z,2016-11-28T20:00:17Z,MEMBER,,0,pydata/xarray/pulls/1139,"Pinning python to 3.5 introduced new [problems](https://readthedocs.org/projects/xray/builds/4723418/). Despite the default python being 3 [here](https://github.com/pydata/xarray/blob/master/readthedocs.yml#L4) and also in our RTD preferences (dashboard->advanced), RTD seems to still use py2 per default... Anyway, going back to py2 is the easiest for now, I might come back to this when I have more time.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 191822204,MDExOlB1bGxSZXF1ZXN0OTUzODA0NDE=,1134,Further attempt to get netCDF4 working on RTD,10050469,closed,0,,,4,2016-11-26T19:25:45Z,2016-11-28T11:11:24Z,2016-11-26T21:48:40Z,MEMBER,,0,pydata/xarray/pulls/1134,"The idea came from here: https://github.com/Unidata/netcdf4-python/issues/574#issuecomment-235435628 I could test it on another repo, so I'm confident that this should work here too. Obviously, this is a temporary solution. 
hopefully fixes https://github.com/pydata/xarray/issues/1106","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 191856659,MDExOlB1bGxSZXF1ZXN0OTU0MDAwNTY=,1137,Pin package versions on RTD,10050469,closed,0,,,1,2016-11-27T11:38:11Z,2016-11-28T03:45:12Z,2016-11-28T03:45:11Z,MEMBER,,0,pydata/xarray/pulls/1137,"Now that we seem to have everything working on RTD, re-pin package versions (follow-up to https://github.com/pydata/xarray/pull/1101)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 191829594,MDU6SXNzdWUxOTE4Mjk1OTQ=,1135,"DOCS: broken ""What's new"" ",10050469,closed,0,,,3,2016-11-26T22:14:17Z,2016-11-26T23:07:36Z,2016-11-26T23:07:36Z,MEMBER,,,,"http://xarray.pydata.org/en/latest/whats-new.html See the examples at the bottom. These are all old examples relying on the ""xray"" package. We can either remove these examples (my suggestion) or update them to use array. As a general rule, I think that the what's new page shouldn't contain any code, at least not code that has to be run by RTD ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 191830253,MDExOlB1bGxSZXF1ZXN0OTUzODUyMDI=,1136,WIP: remove some refs to xray,10050469,closed,0,,,1,2016-11-26T22:30:27Z,2016-11-26T23:01:53Z,2016-11-26T23:00:50Z,MEMBER,,0,pydata/xarray/pulls/1136,"There are some uses of xray that needed to be fixed. WIP: merge after you decide what to do with https://github.com/pydata/xarray/issues/1135 Also, I'd be in favor of removing the two notebooks in https://github.com/pydata/xarray/tree/master/examples since they are also available in the docs (http://xarray.pydata.org/en/latest/examples.html). The example on RTD have the disadvantage that it's less easy to do copy/paste with them, but notebooks are more error prone since they need to be maintained externally. ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188565022,MDU6SXNzdWUxODg1NjUwMjI=,1106,Getting netCDF4 to work on RTD,10050469,closed,0,,,20,2016-11-10T17:07:35Z,2016-11-26T21:48:40Z,2016-11-26T21:48:40Z,MEMBER,,,,"This is to ping @ocefpaf on whether you have an idea on what's going on with netCDF4 on Read The Docs. See the import error [here](http://xarray.pydata.org/en/latest/examples/multidimensional-coords.html). This is our conda config file: https://github.com/pydata/xarray/blob/master/doc/environment.yml I've found related discussions [here](https://groups.google.com/forum/#!topic/pyart-users/OGg1YfnO6ZQ) or [here](https://github.com/ioos/conda-recipes/issues/771). Note that I never saw this when building salem's doc, which installs *many* more packages from conda-forge (see [conf file](https://github.com/fmaussion/salem/blob/master/docs/environment.yml)). Thanks a lot for your help! 
And no hurry.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 188831497,MDU6SXNzdWUxODg4MzE0OTc=,1111,Unable to decode time axis on rasm file,10050469,closed,0,,,2,2016-11-11T19:23:53Z,2016-11-13T18:04:00Z,2016-11-13T18:04:00Z,MEMBER,,,,"```python import xarray as xr import netCDF4 print(xr.__version__) # 0.8.2-50-g57facab print(netCDF4.__version__) # 1.2.4 ds = xr.tutorial.load_dataset('rasm') /home/mowglie/Documents/git/xarray/xarray/conventions.py:389: RuntimeWarning: Unable to decode time axis into full numpy.datetime64 objects, continuing using dummy netCDF4.datetime objects instead, reason: dates out of range result = decode_cf_datetime(example_value, units, calendar) ``` I'll have a closer look.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 188862291,MDExOlB1bGxSZXF1ZXN0OTM0MDY2MzM=,1112,Bug in DecodedCFDatetimeArray,10050469,closed,0,,,0,2016-11-11T22:11:22Z,2016-11-13T18:04:00Z,2016-11-13T18:04:00Z,MEMBER,,0,pydata/xarray/pulls/1112,Fixes https://github.com/pydata/xarray/issues/1111,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188818401,MDExOlB1bGxSZXF1ZXN0OTMzNzQyNjA=,1110,Add default channel to conda on RTD,10050469,closed,0,,,3,2016-11-11T18:13:25Z,2016-11-11T18:36:24Z,2016-11-11T18:36:23Z,MEMBER,,0,pydata/xarray/pulls/1110,"As discussed here: https://github.com/pydata/xarray/issues/1106 I also added a section somewhere to print the packages versions on RTD. It will be useful to pin the versions again once it works, and is useful enough to stay on the docs I think. I just didn't really knew where to put it, other ideas welcome!","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188810255,MDExOlB1bGxSZXF1ZXN0OTMzNjgyNTU=,1108,py2 compat header,10050469,closed,0,,,3,2016-11-11T17:29:05Z,2016-11-11T18:01:11Z,2016-11-11T18:01:06Z,MEMBER,,0,pydata/xarray/pulls/1108,"As discussed in https://github.com/pydata/xarray/pull/1079 I just left out a couple of nearly empty ``__init__``s I also removed a couple of unused imports","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 187208913,MDExOlB1bGxSZXF1ZXN0OTIyNTM4Nzc=,1079,New infer_intervals keyword for pcolormesh,10050469,closed,0,,,23,2016-11-03T22:35:29Z,2016-11-10T22:55:03Z,2016-11-10T22:55:03Z,MEMBER,,0,pydata/xarray/pulls/1079,"Addresses https://github.com/pydata/xarray/issues/781 @jhamman what do you think? I'm not sure if ``infer_interval_breaks`` is the best name for the keyword but I didn't come up with anything better than that. 
(maybe ``infer_coord_intervals``?)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1079/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188546407,MDExOlB1bGxSZXF1ZXN0OTMxNzc4NzI=,1105,Another attempt to get netCDF4 working on RTD + py3,10050469,closed,0,,,1,2016-11-10T15:59:27Z,2016-11-10T16:50:18Z,2016-11-10T16:49:42Z,MEMBER,,0,pydata/xarray/pulls/1105,Sorry @shoyer you might get a few of these debugging PRs today again.,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188350265,MDExOlB1bGxSZXF1ZXN0OTMwMzcyNzA=,1101,unpin package versions for doc build,10050469,closed,0,,,3,2016-11-09T20:55:47Z,2016-11-10T02:44:28Z,2016-11-09T21:07:01Z,MEMBER,,0,pydata/xarray/pulls/1101,,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188125831,MDExOlB1bGxSZXF1ZXN0OTI4NzU1MjY=,1098,Docs tweaks,10050469,closed,0,,,6,2016-11-08T22:50:34Z,2016-11-09T20:52:19Z,2016-11-09T17:44:49Z,MEMBER,,0,pydata/xarray/pulls/1098,"The multidimensional coords example is now built live. I tested this locally and tried to keep the look of the figures as close as possible to the original ones. CC @rabernat @shoyer (I also added a link to salem in the list of packages extending xarray - shameless self-promotion)","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1098/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 188340763,MDExOlB1bGxSZXF1ZXN0OTMwMzAzNDI=,1099,Add cartopy and netcdf4 to the doc build,10050469,closed,0,,,0,2016-11-09T20:10:25Z,2016-11-09T20:41:04Z,2016-11-09T20:41:04Z,MEMBER,,0,pydata/xarray/pulls/1099,as discussed in https://github.com/pydata/xarray/pull/1098,"{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1099/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 187200802,MDU6SXNzdWUxODcyMDA4MDI=,1078,Fascinating bug in contourf,10050469,closed,0,,,3,2016-11-03T21:54:19Z,2016-11-03T22:15:46Z,2016-11-03T22:10:10Z,MEMBER,,,,"Can someone reproduce this or is it just me? 
```python import matplotlib.pyplot as plt import numpy as np import xarray as xr import cartopy.crs as ccrs nlats, nlons = (241, 480) lats = np.linspace(90, -90, nlats, dtype=np.float32) lons = np.linspace(-180, 180-0.75, nlons, dtype=np.float32) l1, l2 = np.meshgrid(lons, lats) data = xr.DataArray(l1 + l2, [('latitude', lats), ('longitude', lons)]) f = plt.figure() ax1 = plt.subplot(2, 1, 1, projection=ccrs.Robinson()) data.plot.contourf(ax=ax1, transform=ccrs.PlateCarree()); ax1.coastlines(color='grey'); ax1.gridlines(); data += 180 # this is the line causing the problem ax2 = plt.subplot(2, 1, 2, projection=ccrs.Robinson()) data.plot.contourf(ax=ax2, transform=ccrs.PlateCarree()); ax2.coastlines(color='grey'); ax2.gridlines(); plt.show() ``` Gives: ![figure_1](https://cloud.githubusercontent.com/assets/10050469/19986783/5780a530-a218-11e6-8b6f-6c1267b23472.png) ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1078/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue