
issues


43 rows where repo = 13221727, type = "issue" and user = 10050469 sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1362895131 I_kwDOAMm_X85RPCEb 6996 attrs are now views, not copies fmaussion 10050469 closed 0     5 2022-09-06T08:23:59Z 2022-09-06T10:45:43Z 2022-09-06T10:45:36Z MEMBER      

What is your issue?

I'm not sure yet if this is a feature or a bug - I would tend to the latter. Apologies if this has been discussed before.

Objects originating from operations such as `y = x > 2` now share the same attrs as their parent, which leads to things like:

```python
import numpy as np
import xarray as xr
xr.__version__
# '2022.6.0'

x = xr.DataArray(
    0.1 * np.arange(10),
    dims=["lat"],
    coords={"lat": np.arange(10)},
    name="sst",
)
x.lat.attrs['long_name'] = 'latitude'
x.lat.attrs
# {'long_name': 'latitude'}

y = x > 2
y.lat.attrs
# {'long_name': 'latitude'}

y.lat.attrs = {}
x.lat.attrs  # x is changed as well!
# {}
```

I find this rather non-intuitive behavior, but I'm happy to discuss!
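For readers trying to pin down what "views, not copies" means here, the distinction can be sketched with plain dicts standing in for `.attrs` (an illustration only, not xarray internals):

```python
# Plain dicts as stand-ins for .attrs: a shared reference ("view") propagates
# mutations back to the parent, while an explicit copy decouples the objects.
x_attrs = {"long_name": "latitude"}

y_attrs = x_attrs          # view: same dict object
y_attrs.clear()
assert x_attrs == {}       # the parent changed as well

x_attrs = {"long_name": "latitude"}
y_attrs = dict(x_attrs)    # copy: independent dict
y_attrs.clear()
assert x_attrs == {"long_name": "latitude"}  # the parent is untouched
```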

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6996/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1051241489 I_kwDOAMm_X84-qKwR 5976 Should str.format() work on xarray scalars? fmaussion 10050469 closed 0     3 2021-11-11T18:15:59Z 2022-07-25T20:01:29Z 2022-07-25T20:01:29Z MEMBER      

Consider:

```python
da = xr.DataArray([1, 2, 3])
print(f'{da[0]}')
print(f'{da[0]:d}')
```

Which outputs:

```
<xarray.DataArray ()>
array(1)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-36-9cd7dc76455b> in <module>
      1 da = xr.DataArray([1, 2, 3])
      2 print(f'{da[0]}')
----> 3 print(f'{da[0]:d}')

TypeError: unsupported format string passed to DataArray.__format__
```

And the numpy equivalent:

```python
da = xr.DataArray([1, 2, 3]).data
print(f'{da[0]}')
print(f'{da[0]:d}')
# 1
# 1
```

I have always found the xarray scalar output a bit unfriendly for beginners. In my classes, scalars are very often the last output of a computation, and the fact that we can't format the relatively verbose xarray output without resorting to the `.data` trick is a bit confusing for students (but I agree this is a detail).

Is there a way to get print(f'{da[0]:d}') to work? Thoughts?
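For what it's worth, the fix on the wrapper side can be sketched in plain Python (`FakeDataArray` below is a hypothetical stand-in, not xarray's actual class): delegate the format spec to the wrapped scalar whenever the array is 0-d.

```python
class FakeDataArray:
    """Toy stand-in for a 0-d DataArray wrapping a single value."""

    def __init__(self, value, ndim=0):
        self.value = value
        self.ndim = ndim

    def __format__(self, spec):
        if self.ndim == 0:
            # Delegate 'd', '.2f', etc. to the underlying scalar.
            return format(self.value, spec)
        if spec:
            raise TypeError("unsupported format string for non-scalar array")
        return str(self.value)


assert f"{FakeDataArray(1):d}" == "1"
assert f"{FakeDataArray(0.5):.1f}" == "0.5"
```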

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5976/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
256496917 MDU6SXNzdWUyNTY0OTY5MTc= 1565 Regression: time attributes on PeriodIndex fmaussion 10050469 open 0     12 2017-09-10T09:27:09Z 2021-07-20T18:33:29Z   MEMBER      

The following used to work with xarray 0.9.5 but doesn't anymore with 0.9.6 or master:

```python
import xarray as xr
import pandas as pd
import numpy as np
time = pd.period_range('2000-01', '2000-12', freq='M')
da = xr.DataArray(np.arange(12), dims=['time'], coords={'time': time})
da['time.month']
```

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in _getitem_coord(self, key)
    458         try:
--> 459             var = self._coords[key]
    460         except KeyError:

KeyError: 'time.month'

During handling of the above exception, another exception occurred:

AttributeError                            Traceback (most recent call last)
<ipython-input-1-41829b924596> in <module>()
      4 time = pd.period_range('2000-01', '2000-12', freq='M')
      5 da = xr.DataArray(np.arange(12), dims=['time'], coords={'time':time})
----> 6 da['time.month']

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in __getitem__(self, key)
    467     def __getitem__(self, key):
    468         if isinstance(key, basestring):
--> 469             return self._getitem_coord(key)
    470         else:
    471             # orthogonal array indexing

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in _getitem_coord(self, key)
    461             dim_sizes = dict(zip(self.dims, self.shape))
    462             _, key, var = _get_virtual_variable(
--> 463                 self._coords, key, self._level_coords, dim_sizes)
    464
    465         return self._replace_maybe_drop_dims(var, name=key)

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in _get_virtual_variable(variables, key, level_vars, dim_sizes)
     82             data = getattr(ref_var.dt, var_name).data
     83         else:
---> 84             data = getattr(ref_var, var_name).data
     85         virtual_var = Variable(ref_var.dims, data)
     86

AttributeError: 'IndexVariable' object has no attribute 'month'
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1565/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
840258082 MDU6SXNzdWU4NDAyNTgwODI= 5073 `lock` kwarg needs a deprecation cycle? fmaussion 10050469 closed 0     6 2021-03-24T22:39:15Z 2021-05-04T14:31:10Z 2021-05-04T14:30:09Z MEMBER      

Salem's tests on master fail because I use the `lock` kwarg to `open_dataset`, which seems to have disappeared in the backend refactoring.

Should the new `open_dataset` simply ignore `lock` and raise a `FutureWarning` when it is used?
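A deprecation shim along those lines might look like the following sketch (`_open_dataset_impl` is a hypothetical stand-in for the real backend entry point, not xarray code):

```python
import warnings


def open_dataset(filename, lock=None, **kwargs):
    """Accept the removed 'lock' kwarg, warn, and otherwise ignore it."""
    if lock is not None:
        warnings.warn(
            "the 'lock' keyword argument is no longer used by open_dataset "
            "and will raise an error in a future version",
            FutureWarning,
            stacklevel=2,
        )
    return _open_dataset_impl(filename, **kwargs)


def _open_dataset_impl(filename, **kwargs):
    # hypothetical backend entry point
    return filename


# Callers passing lock= keep working, but see a FutureWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = open_dataset("test.nc", lock=True)
assert result == "test.nc"
assert any(issubclass(w.category, FutureWarning) for w in caught)
```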

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5073/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
771127744 MDU6SXNzdWU3NzExMjc3NDQ= 4710 open_mfdataset -> to_netcdf() randomly leading to dead workers fmaussion 10050469 closed 0     4 2020-12-18T19:42:14Z 2020-12-22T11:54:37Z 2020-12-22T11:54:37Z MEMBER      

This is:

- xarray: 0.16.2
- dask: 2.30.0

I'm not sure a github issue is the right place to report this, but I'm not sure where else, so here it is.

I just had two very long weeks of debugging stalled (i.e. "dead") OGGM jobs in a cluster environment. I finally nailed it down to ds.to_netcdf(path) in this situation:

```python
with xr.open_mfdataset(tmp_paths, combine='nested', concat_dim='rgi_id') as ds:
    ds.to_netcdf(path)
```

`tmp_paths` are a few netcdf files (from 2 to about 60). The combined dataset is nothing close to big (a few hundred MB at most).

Most of the time, this command works just fine. But in 30% of the cases, this would just... stop and stall. One or more of the workers would simply stop working without coming back or erroring.

What I can give as additional information:

- changing `ds.to_netcdf(path)` to `ds.load().to_netcdf(path)` solves the problem
- the problem became worse (i.e. more frequent) when the files to concatenate increased in the number of variables (the final size of the concatenated file doesn't seem to matter at all; it occurs also with files < 1 MB)
- I can't reproduce the problem locally. The files are here if someone's interested, but I don't think the files are the issue here.
- the files use gzip compression
- on cluster, we are dealing with 64-core nodes, which do a lot of work before arriving at these two lines. We use python multiprocessing ourselves before that, create our own pool and use it, etc. But at the moment the job hits these two lines, no other job is running.

Is this some kind of weird interaction between our own multiprocessing and dask? Is it more an IO problem that occurs only on cluster? I don't know.

I know this is a crappy bug report, but the fact that I lost a lot of time on this recently has gotten on my nerves :wink: (I'm mostly angry at myself for taking so long to find out that these two lines were the problem).

In order to make a question out of this crappy report: how can I possibly debug this? I solved my problem now (with ds.load()), but this is not really satisfying. Any tip is appreciated!

cc @TimoRoth our cluster IT whom I annoyed a lot before finding out that the problem was in xarray/dask
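As a generic answer to "how can I possibly debug this?": one low-effort option (stdlib only, not dask-specific, so only a suggestion) is to arm `faulthandler` so a hung process periodically dumps every thread's stack to stderr, which usually shows where the stalled workers are stuck:

```python
import faulthandler

# If the process is still running after `timeout` seconds, dump all thread
# tracebacks to stderr, repeatedly, without killing the process.
faulthandler.dump_traceback_later(timeout=600, repeat=True, exit=False)

# ... run the suspect code here, e.g. the open_mfdataset/to_netcdf block ...

# Disarm once the suspect section has completed normally.
faulthandler.cancel_dump_traceback_later()
```

The periodic dumps land in the job's stderr log, so this works even in batch cluster environments where attaching a debugger is impractical.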

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4710/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
184456540 MDU6SXNzdWUxODQ0NTY1NDA= 1056 groupby_bins along two dims simultaneously fmaussion 10050469 open 0     3 2016-10-21T10:50:06Z 2020-10-04T05:06:37Z   MEMBER      

I probably missed it, but what is the way to apply groupby (or rather groupby_bins) in order to achieve the following in xarray?

```python
da = xr.DataArray(np.arange(16).reshape((4, 4)))
da
```
```
<xarray.DataArray (dim_0: 4, dim_1: 4)>
array([[ 0,  1,  2,  3],
       [ 4,  5,  6,  7],
       [ 8,  9, 10, 11],
       [12, 13, 14, 15]])
Coordinates:
  * dim_0    (dim_0) int64 0 1 2 3
  * dim_1    (dim_1) int64 0 1 2 3
```

which should be aggregated (in the case of summing) to obtain:

```
dagg
<xarray.DataArray (dim_0: 2, dim_1: 2)>
array([[10, 18],
       [42, 50]])
Coordinates:
  * dim_1    (dim_1) int64 0 2
  * dim_0    (dim_0) int64 0 2
```
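For reference, the aggregation asked for here is a non-overlapping 2x2 block sum; in plain Python (no xarray) it can be sketched as:

```python
# Same values as the 4x4 example array above.
a = [[4 * r + c for c in range(4)] for r in range(4)]

# Sum each non-overlapping 2x2 block, halving both dimensions.
agg = [
    [sum(a[2 * i + di][2 * j + dj] for di in (0, 1) for dj in (0, 1))
     for j in range(2)]
    for i in range(2)
]
assert agg == [[10, 18], [42, 50]]
```

Whether this can be expressed as two chained `groupby_bins` calls (binning both dims) is exactly the question posed above.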

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1056/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
177903376 MDU6SXNzdWUxNzc5MDMzNzY= 1009 Shouldn't .where() pass the attributes of DataArrays and DataSets? fmaussion 10050469 closed 0     4 2016-09-19T21:30:13Z 2020-04-05T19:11:46Z 2016-09-21T17:26:32Z MEMBER      

Everything is in the title! I think it should, if possible.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1009/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
247054695 MDU6SXNzdWUyNDcwNTQ2OTU= 1498 Update doc example for open_mfdataset fmaussion 10050469 closed 0     1 2017-08-01T12:37:58Z 2019-08-01T13:13:36Z 2019-08-01T13:13:36Z MEMBER      

The current doc shows bits of code which are now irrelevant thanks to open_mfdataset: http://xarray.pydata.org/en/stable/io.html#id7

On a related note, it would be great to document the bottlenecks in concat / dask and how to overcome them. Related to https://github.com/pydata/xarray/issues/1391, https://github.com/pydata/xarray/issues/1495, and https://github.com/pydata/xarray/issues/1379

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1498/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
230214809 MDU6SXNzdWUyMzAyMTQ4MDk= 1418 Indexing time with lists fmaussion 10050469 closed 0     3 2017-05-21T11:38:11Z 2019-06-29T01:58:33Z 2019-06-29T01:58:33Z MEMBER      

Is this a bug? Look the following example:

```python
ds = xr.tutorial.load_dataset('air_temperature')

ds.sel(time='2013-01-01T00:00')    # works fine [output removed]

ds.sel(time=['2013-01-01T00:00'])  # errors
```
```
Traceback (most recent call last):
  File "/home/mowglie/.pycharm-community-2017.1/helpers/pydev/_pydevd_bundle/pydevd_exec2.py", line 3, in Exec
    exec(exp, global_vars, local_vars)
  File "", line 1, in <module>
  File "/home/mowglie/Documents/git/xarray-official/xarray/core/dataset.py", line 1206, in sel
    self, indexers, method=method, tolerance=tolerance
  File "/home/mowglie/Documents/git/xarray-official/xarray/core/indexing.py", line 290, in remap_label_indexers
    dim, method, tolerance)
  File "/home/mowglie/Documents/git/xarray-official/xarray/core/indexing.py", line 229, in convert_label_indexer
    % index_name)
KeyError: "not all values found in index 'time'"
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1418/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
383791450 MDU6SXNzdWUzODM3OTE0NTA= 2565 CF conventions: time_bnds and time units fmaussion 10050469 closed 0     4 2018-11-23T11:32:37Z 2019-01-08T22:28:37Z 2019-01-08T22:28:37Z MEMBER      

Problem

Here is the dump of a NetCDF file (download):

```
netcdf cesm.TREFHT.160001-200512.selection {
dimensions:
        time = UNLIMITED ; // (4872 currently)
        lat = 3 ;
        lon = 3 ;
        nbnd = 2 ;
variables:
        float TREFHT(time, lat, lon) ;
                TREFHT:units = "K" ;
                TREFHT:long_name = "Reference height temperature" ;
                TREFHT:cell_methods = "time: mean" ;
        double lat(lat) ;
                lat:long_name = "latitude" ;
                lat:units = "degrees_north" ;
        double lon(lon) ;
                lon:long_name = "longitude" ;
                lon:units = "degrees_east" ;
        double time(time) ;
                time:long_name = "time" ;
                time:units = "days since 0850-01-01 00:00:00" ;
                time:calendar = "noleap" ;
                time:bounds = "time_bnds" ;
        double time_bnds(time, nbnd) ;
                time_bnds:long_name = "time interval endpoints" ;

// global attributes:
                :Conventions = "CF-1.0" ;
                :source = "CAM" ;
                ...
}
```

When xarray decodes the time coordinates it also deletes the time:units attribute (this kind of makes sense, because the unit has no meaning when the time is converted to a CFTime object):

```python
import xarray as xr
ds = xr.open_dataset(f)
ds.time
```
```
<xarray.DataArray 'time' (time: 4872)>
array([cftime.DatetimeNoLeap(1600, 2, 1, 0, 0, 0, 0, 0, 32),
       cftime.DatetimeNoLeap(1600, 3, 1, 0, 0, 0, 0, 0, 60),
       cftime.DatetimeNoLeap(1600, 4, 1, 0, 0, 0, 0, 3, 91),
       ...,
       cftime.DatetimeNoLeap(2005, 11, 1, 0, 0, 0, 0, 6, 305),
       cftime.DatetimeNoLeap(2005, 12, 1, 0, 0, 0, 0, 1, 335),
       cftime.DatetimeNoLeap(2006, 1, 1, 0, 0, 0, 0, 4, 1)], dtype=object)
Coordinates:
  * time     (time) object 1600-02-01 00:00:00 ... 2006-01-01 00:00:00
Attributes:
    long_name:  time
    bounds:     time_bnds
```

The problem is that I now have no way to actually decode the `time_bnds` variable from xarray alone, because the `time_bnds` variable doesn't store the time units. At first I thought my file was not CF compliant, but I've looked into the CF conventions and it seems they do not prescribe that `time_bnds` should also have a `units` attribute.

Solution

I actually don't know what we should do here. I see a couple of ways:

1. we don't care and leave it to the user (here: me) to open the file with netCDF4 to decode the time bounds
2. we don't delete the `time:units` attribute after decoding
3. we start to also decode `time_bnds` when available, like we do with `time`
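Option 3 can be sketched with a toy decoder (an illustration only, not xarray's CF machinery, and using the proleptic calendar rather than the file's `noleap` calendar): decode `time_bnds` by borrowing the parent coordinate's `units`:

```python
from datetime import datetime, timedelta


def decode_days_since(values, units):
    """Toy CF decoder for 'days since YYYY-MM-DD ...' units."""
    origin_str = units.split("since")[1].strip()[:10]
    origin = datetime.strptime(origin_str, "%Y-%m-%d")
    return [origin + timedelta(days=v) for v in values]


time_attrs = {"units": "days since 0850-01-01 00:00:00", "bounds": "time_bnds"}
time_bnds = [0.0, 31.0]  # has no units attribute of its own, per CF

# Reuse the parent coordinate's units to decode the bounds variable.
decoded = decode_days_since(time_bnds, time_attrs["units"])
assert decoded[1] - decoded[0] == timedelta(days=31)
```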

Thoughts? cc @spencerkclark @jhamman

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2565/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
344879199 MDU6SXNzdWUzNDQ4NzkxOTk= 2316 rasterio released v1 as stable fmaussion 10050469 closed 0     1 2018-07-26T14:53:54Z 2019-01-08T22:19:32Z 2019-01-08T22:19:32Z MEMBER      

conda-forge now ships v1.0.1 by default.

After two years of betas and release candidates this is very welcome!

We have very little code specifically handling pre-v1 and post-v1, but we should keep it around for a couple more months.

Will update the CI to reflect this change.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2316/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
379177627 MDU6SXNzdWUzNzkxNzc2Mjc= 2551 HDF Errors since xarray 0.11 fmaussion 10050469 closed 0     10 2018-11-09T14:14:11Z 2018-11-12T00:12:46Z 2018-11-11T12:10:36Z MEMBER      

(EDIT: sorry for unexpected early posting)

I just wanted to open this issue here to see if it has some resonance in other projects.

We are getting new unexpected HDF Errors in our test suite which are definitely due to the recent xarray update (reverting to 0.10.9 solves the problem).

The error is the famous (and very informative):

`[Errno -101] NetCDF: HDF error`

I have not been able to create a MWE yet, but it has something to do with read -> close -> append workflows on netcdf4 files (the error happens at the "append" step). Possibly multiprocessing also plays a role, but I can't be sure yet.

I will try to find a way to reproduce this with a simple example, but this might take a while...

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2551/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
362516560 MDU6SXNzdWUzNjI1MTY1NjA= 2428 Stickler-ci fmaussion 10050469 closed 0     2 2018-09-21T08:50:25Z 2018-10-07T22:40:08Z 2018-10-07T22:40:08Z MEMBER      

The last time stickler had a look at our PRs is 8 days ago (https://github.com/pydata/xarray/pull/2415) : https://stickler-ci.com/repositories/26661-pydata-xarray

It looks like their bot is broken: https://github.com/stickler-ci

This is not very trustworthy - maybe we could consider switching to https://pep8speaks.com/

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2428/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
361818115 MDU6SXNzdWUzNjE4MTgxMTU= 2422 Plot2D no longer sorts coordinates before plotting fmaussion 10050469 closed 0     6 2018-09-19T16:00:56Z 2018-09-21T17:47:12Z 2018-09-21T17:47:12Z MEMBER      

I have a dataset with decreasing latitude coordinates.

With xarray v0.10.8, this is what happens when plotting:

But on latest master the image is now upside down:

Sorry if I missed a change along the way, I was off for a long time.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2422/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
343944521 MDU6SXNzdWUzNDM5NDQ1MjE= 2306 Timeouts errors on readthedocs fmaussion 10050469 closed 0     3 2018-07-24T08:55:23Z 2018-07-26T14:41:48Z 2018-07-26T14:41:48Z MEMBER      

We are reaching the 900s build time limit on readthedocs more often than not (https://readthedocs.org/projects/xray/builds/). I have the same problem with all of my OS projects.

The bottleneck is the conda environment installation, which took 457s on the latest failed build. I'm going to try to shave off some time in a subsequent PR, but we might have to get in touch with the RTD people to get more build time.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2306/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
297452821 MDU6SXNzdWUyOTc0NTI4MjE= 1912 Code review bots? fmaussion 10050469 closed 0     4 2018-02-15T13:51:39Z 2018-05-01T07:24:00Z 2018-05-01T07:24:00Z MEMBER      

I'm seeing them from time to time on other repositories. One that seems reasonable and not toooo intrusive is stickler, for code style review: https://stickler-ci.com/

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1912/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
302679890 MDU6SXNzdWUzMDI2Nzk4OTA= 1966 imshow should work with third dimension of len 1 fmaussion 10050469 closed 0     2 2018-03-06T12:22:57Z 2018-03-08T23:51:45Z 2018-03-08T23:51:45Z MEMBER      

Code Sample, a copy-pastable example if possible

```python
import xarray as xr
import numpy as np

da = xr.DataArray(np.arange(9).reshape((1, 3, 3)))

da.plot()         # works
da.plot.imshow()  # fails
```

Error log:

```
/home/mowglie/Documents/git/xarray/xarray/plot/utils.py:295: UserWarning: Several dimensions of this array could be colors.  Xarray will use the last possible dimension ('dim_2') to match matplotlib.pyplot.imshow.  You can pass names of x, y, and/or rgb dimensions to override this guess.
  'and/or rgb dimensions to override this guess.' % rgb)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-7-a0592d3e8758> in <module>()
----> 1 da.plot.imshow()

~/Documents/git/xarray/xarray/plot/plot.py in plotmethod(_PlotMethods_obj, x, y, figsize, size, aspect, ax, row, col, col_wrap, xincrease, yincrease, add_colorbar, add_labels, vmin, vmax, cmap, colors, center, robust, extend, levels, infer_intervals, subplot_kws, cbar_ax, cbar_kwargs, **kwargs)
    679         for arg in ['_PlotMethods_obj', 'newplotfunc', 'kwargs']:
    680             del allargs[arg]
--> 681         return newplotfunc(**allargs)
    682
    683     # Add to class _PlotMethods

~/Documents/git/xarray/xarray/plot/plot.py in newplotfunc(darray, x, y, figsize, size, aspect, ax, row, col, col_wrap, xincrease, yincrease, add_colorbar, add_labels, vmin, vmax, cmap, center, robust, extend, levels, infer_intervals, colors, subplot_kws, cbar_ax, cbar_kwargs, **kwargs)
    553         rgb = kwargs.pop('rgb', None)
    554         xlab, ylab = _infer_xy_labels(
--> 555             darray=darray, x=x, y=y, imshow=imshow_rgb, rgb=rgb)
    556
    557         if rgb is not None and plotfunc.__name__ != 'imshow':

~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels(darray, x, y, imshow, rgb)
    308     assert x is None or x != y
    309     if imshow and darray.ndim == 3:
--> 310         return _infer_xy_labels_3d(darray, x, y, rgb)
    311
    312     if x is None and y is None:

~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels_3d(darray, x, y, rgb)
    297
    298     # Finally, we pick out the red slice and delegate to the 2D version:
--> 299     return _infer_xy_labels(darray.isel(**{rgb: 0}).squeeze(), x, y)
    300
    301

~/Documents/git/xarray/xarray/plot/utils.py in _infer_xy_labels(darray, x, y, imshow, rgb)
    312     if x is None and y is None:
    313         if darray.ndim != 2:
--> 314             raise ValueError('DataArray must be 2d')
    315         y, x = darray.dims
    316     elif x is None:

ValueError: DataArray must be 2d
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1966/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
293858326 MDU6SXNzdWUyOTM4NTgzMjY= 1880 Should imshow() recognise 0-255 images? fmaussion 10050469 closed 0     1 2018-02-02T11:30:21Z 2018-02-12T22:12:13Z 2018-02-12T22:12:13Z MEMBER      

Code Sample, a copy-pastable example if possible

```python
import os
import urllib.request
import xarray as xr
import matplotlib.pyplot as plt

# Download the file from rasterio's repository
url = 'https://github.com/mapbox/rasterio/raw/master/tests/data/RGB.byte.tif'
urllib.request.urlretrieve(url, 'RGB.byte.tif')

# Read the data
da = xr.open_rasterio('RGB.byte.tif')

f, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
da.plot.imshow(ax=ax1)
(da / 255).plot.imshow(ax=ax2)
plt.tight_layout()
plt.show()

# Delete the file
os.remove('RGB.byte.tif')
```

Problem description

In https://github.com/pydata/xarray/pull/1796, @Zac-HD added support for RGBA images. If an alpha channel is not found, it is added (code)

The problem is that adding this alpha channel requires the image to be normalized to 0-1, while plotting an image in the 0-255 range without an alpha channel works fine in matplotlib. Removing https://github.com/pydata/xarray/blob/master/xarray/plot/plot.py#L708-L715 would solve the problem, but I guess it was added for a reason.

@Zac-HD , thoughts?
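The ambiguity can be sketched in plain Python (`normalize_rgb` is a hypothetical helper for illustration, not the code in plot.py): a pre-pass that insists on 0-1 floats has to rescale integer 0-255 images rather than reject them.

```python
def normalize_rgb(pixels):
    """Rescale integer 0-255 RGB pixels to 0-1 floats; pass floats through."""
    flat = [c for px in pixels for c in px]
    if all(isinstance(c, int) for c in flat) and max(flat) > 1:
        # Heuristic: integer values above 1 are assumed to be 8-bit samples.
        return [[c / 255 for c in px] for px in pixels]
    return pixels


assert normalize_rgb([[0, 128, 255]]) == [[0.0, 128 / 255, 1.0]]
assert normalize_rgb([[0.0, 0.5, 1.0]]) == [[0.0, 0.5, 1.0]]
```

This mirrors matplotlib's own convention: `imshow` accepts either float RGB(A) in [0, 1] or uint8 in [0, 255].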

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1880/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
296303590 MDU6SXNzdWUyOTYzMDM1OTA= 1903 Broken distributed test fmaussion 10050469 closed 0     2 2018-02-12T09:04:51Z 2018-02-12T21:08:05Z 2018-02-12T21:08:05Z MEMBER      

The recent distributed update (1.20.2) broke a test: https://github.com/pydata/xarray/blob/master/xarray/tests/test_distributed.py#L57-L84

It fails with:

```
>       assert s.task_state
E       AttributeError: 'Scheduler' object has no attribute 'task_state'
```

```
__________________________________ test_async __________________________________

    def test_func():
        # Restore default logging levels
        # XXX use pytest hooks/fixtures instead?
        for name, level in logging_levels.items():
            logging.getLogger(name).setLevel(level)

        old_globals = _globals.copy()
        result = None
        workers = []
        with pristine_loop() as loop:
            with check_active_rpc(loop, active_rpc_timeout):
                @gen.coroutine
                def coro():
                    for i in range(5):
                        try:
                            s, ws = yield start_cluster(
                                ncores, scheduler, loop, security=security,
                                Worker=Worker,
                                scheduler_kwargs=scheduler_kwargs,
                                worker_kwargs=worker_kwargs)
                        except Exception:
                            logger.error("Failed to start gen_cluster, retryng")
                        else:
                            break
                    workers[:] = ws
                    args = [s] + workers
                    if client:
                        c = yield Client(s.address, loop=loop, security=security,
                                         asynchronous=True)
                        args = [c] + args
                    try:
                        result = yield func(*args)
                        if s.validate:
                            s.validate_state()
                    finally:
                        if client:
                            yield c._close()
                        yield end_cluster(s, workers)
                        _globals.clear()
                        _globals.update(old_globals)
                    raise gen.Return(result)

>               result = loop.run_sync(coro, timeout=timeout)

../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/distributed/utils_test.py:749:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/ioloop.py:458: in run_sync
    return future_cell[0].result()
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/concurrent.py:238: in result
    raise_exc_info(self._exc_info)
<string>:4: in raise_exc_info
    ???
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1069: in run
    yielded = self.gen.send(value)
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/distributed/utils_test.py:737: in coro
    result = yield func(*args)
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1055: in run
    value = future.result()
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/concurrent.py:238: in result
    raise_exc_info(self._exc_info)
<string>:4: in raise_exc_info
    ???
../../../../.pyvirtualenvs/py3/lib/python3.5/site-packages/tornado/gen.py:1069: in run
    yielded = self.gen.send(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

c = <Client: not connected>
s = <Scheduler: "tcp://127.0.0.1:38907" processes: 0 cores: 0>
a = <Worker: tcp://127.0.0.1:46497, closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker: tcp://127.0.0.1:33287, closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @pytest.mark.skipif(distributed.__version__ <= '1.19.3',
                        reason='Need recent distributed version to clean up get')
    @gen_cluster(client=True, timeout=None)
    def test_async(c, s, a, b):
        x = create_test_data()
        assert not dask.is_dask_collection(x)
        y = x.chunk({'dim2': 4}) + 10
        assert dask.is_dask_collection(y)
        assert dask.is_dask_collection(y.var1)
        assert dask.is_dask_collection(y.var2)

        z = y.persist()
        assert str(z)

        assert dask.is_dask_collection(z)
        assert dask.is_dask_collection(z.var1)
        assert dask.is_dask_collection(z.var2)
        assert len(y.__dask_graph__()) > len(z.__dask_graph__())

        assert not futures_of(y)
        assert futures_of(z)

        future = c.compute(z)
        w = yield future
        assert not dask.is_dask_collection(w)
        assert_allclose(x + 10, w)

>       assert s.task_state
E       AttributeError: 'Scheduler' object has no attribute 'task_state'
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1903/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
289976092 MDU6SXNzdWUyODk5NzYwOTI= 1843 Refactor/modernize the rasterio backend test suite fmaussion 10050469 closed 0 fmaussion 10050469   2 2018-01-19T13:30:02Z 2018-02-07T08:40:34Z 2018-02-07T08:40:34Z MEMBER      

Once https://github.com/pydata/xarray/pull/1817 and https://github.com/pydata/xarray/pull/1712 are merged it might be a good idea to revisit the tests to remove boilerplate code and try to generalize them.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1843/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
186751743 MDU6SXNzdWUxODY3NTE3NDM= 1073 Dataset.concat() doesn't preserve coordinates-variables order fmaussion 10050469 closed 0     1 2016-11-02T09:37:01Z 2017-11-14T21:02:32Z 2017-11-14T21:02:31Z MEMBER      

Follow-up to https://github.com/pydata/xarray/pull/1049

Example:

```python
import xarray as xr
import numpy as np

ds = xr.Dataset()
for vn in ['a', 'b', 'c']:
    ds[vn] = xr.DataArray(np.arange(10), dims=['t'])
dsg = ds.groupby('t').mean()

print(list(ds.variables.keys()))
# out: ['t', 'a', 'b', 'c']
print(list(dsg.variables.keys()))
# out: ['a', 'b', 'c', 't']
```
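The invariant asked for here can be sketched in plain Python (a sketch of the idea only, using modern dict insertion-order guarantees rather than xarray internals): when reassembling variables after an operation, iterate in the original dataset's order rather than the order the computation produced.

```python
original_order = ['t', 'a', 'b', 'c']         # order of ds.variables
computed = {'a': 1, 'b': 2, 'c': 3, 't': 0}   # order lost by the operation

# Rebuild the mapping in the original order.
restored = {name: computed[name] for name in original_order if name in computed}
assert list(restored) == ['t', 'a', 'b', 'c']
```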

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1073/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
181881219 MDU6SXNzdWUxODE4ODEyMTk= 1042 Dataset.groupby() doesn't preserve variables order fmaussion 10050469 closed 0     8 2016-10-09T11:09:11Z 2017-11-14T20:24:50Z 2016-11-02T09:34:46Z MEMBER      

Is it intentional? I think it is rather undesirable, but maybe there is some reason for this.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1042/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
271599372 MDU6SXNzdWUyNzE1OTkzNzI= 1694 Regression: dropna() on lazy variable fmaussion 10050469 closed 0   0.10 2415632 10 2017-11-06T19:53:18Z 2017-11-08T13:49:01Z 2017-11-08T13:36:09Z MEMBER      

Code Sample, a copy-pastable example if possible

```python
import numpy as np
import xarray as xr

a = np.random.randn(4, 3)
a[1, 1] = np.NaN
da = xr.DataArray(a, dims=('y', 'x'),
                  coords={'y': np.arange(4), 'x': np.arange(3)})
da.to_netcdf('test.nc')

with xr.open_dataarray('test.nc') as da:
    da.dropna(dim='x', how='any')
```
```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-37-8d137cf3a813> in <module>()
      8
      9 with xr.open_dataarray('test.nc') as da:
---> 10     da.dropna(dim='x', how='any')

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in dropna(self, dim, how, thresh)
   1158         DataArray
   1159         """
-> 1160         ds = self._to_temp_dataset().dropna(dim, how=how, thresh=thresh)
   1161         return self._from_temp_dataset(ds)
   1162

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in dropna(self, dim, how, thresh, subset)
   2292             raise TypeError('must specify how or thresh')
   2293
-> 2294         return self.isel(**{dim: mask})
   2295
   2296     def fillna(self, value):

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, **indexers)
   1291         coord_names = set(variables).intersection(self._coord_names)
   1292         selected = self._replace_vars_and_dims(variables,
-> 1293                                                coord_names=coord_names)
   1294
   1295         # Extract coordinates from indexers

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in _replace_vars_and_dims(self, variables, coord_names, dims, attrs, inplace)
    598         """
    599         if dims is None:
--> 600             dims = calculate_dimensions(variables)
    601         if inplace:
    602             self._dims = dims

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in calculate_dimensions(variables)
    111                 raise ValueError('conflicting sizes for dimension %r: '
    112                                  'length %s on %r and length %s on %r' %
--> 113                                  (dim, size, k, dims[dim], last_used[dim]))
    114         return dims
    115

ValueError: conflicting sizes for dimension 'y': length 2 on <this-array> and length 4 on 'y'
```

Problem description

See above. Note that the code runs when:

- the data is previously read into memory with `load()`
- the DataArray is stored without coordinates (this is strange)
- `dropna` is applied to `'y'` instead of `'x'`

Expected Output

This used to work in v0.9.6
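For reference, a small in-memory sketch (mine, not from the report) of the expected result — calling `load()` first side-steps the lazy-indexing code path:

```python
import numpy as np
import xarray as xr

a = np.random.randn(4, 3)
a[1, 1] = np.nan
da = xr.DataArray(a, dims=('y', 'x'),
                  coords={'y': np.arange(4), 'x': np.arange(3)})

# workaround: read the data into memory before dropna
dropped = da.load().dropna(dim='x', how='any')
# the NaN sits at x=1, so only x=0 and x=2 survive
```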

Output of xr.show_versions()

INSTALLED VERSIONS ------------------ commit: None python: 3.5.2.final.0 python-bits: 64 OS: Linux OS-release: 4.10.0-38-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 xarray: 0.10.0rc1-5-g2a1d392 pandas: 0.21.0 numpy: 1.13.3 scipy: 1.0.0 netCDF4: 1.3.0 h5netcdf: None Nio: None bottleneck: 1.2.1 cyordereddict: None dask: 0.15.4 matplotlib: 2.1.0 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 36.6.0 pip: 9.0.1 conda: None pytest: 3.2.3 IPython: 6.2.1 sphinx: 1.6.5
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1694/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
271036342 MDU6SXNzdWUyNzEwMzYzNDI= 1688 NotImplementedError: Vectorized indexing for <class 'xarray.core.indexing.LazilyIndexedArray'> is not implemented. fmaussion 10050469 closed 0   0.10 2415632 1 2017-11-03T16:21:26Z 2017-11-07T20:41:44Z 2017-11-07T20:41:44Z MEMBER      

I think this is a regression in the current 0.10.0rc1:

Code Sample

```python
import xarray as xr

ds = xr.open_dataset('cesm_data.nc', decode_cf=False)
ds.temp.isel(time=ds.time < 274383)  # throws an error


NotImplementedError Traceback (most recent call last) <ipython-input-18-a5c4179cd02d> in <module>() ----> 1 ds.temp.isel(time=ds.time < 274383)

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataarray.py in isel(self, drop, indexers) 717 DataArray.sel 718 """ --> 719 ds = self._to_temp_dataset().isel(drop=drop, indexers) 720 return self._from_temp_dataset(ds) 721

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/dataset.py in isel(self, drop, indexers) 1278 for name, var in iteritems(self._variables): 1279 var_indexers = {k: v for k, v in indexers_list if k in var.dims} -> 1280 new_var = var.isel(var_indexers) 1281 if not (drop and name in var_indexers): 1282 variables[name] = new_var

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/variable.py in isel(self, **indexers) 771 if dim in indexers: 772 key[i] = indexers[dim] --> 773 return self[tuple(key)] 774 775 def squeeze(self, dim=None):

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/variable.py in getitem(self, key) 595 """ 596 dims, index_tuple, new_order = self._broadcast_indexes(key) --> 597 data = self._indexable_data[index_tuple] 598 if new_order: 599 data = np.moveaxis(data, range(len(new_order)), new_order)

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in getitem(self, key) 414 415 def getitem(self, key): --> 416 return type(self)(_wrap_numpy_scalars(self.array[key])) 417 418 def setitem(self, key, value):

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in getitem(self, key) 394 395 def getitem(self, key): --> 396 return type(self)(_wrap_numpy_scalars(self.array[key])) 397 398 def setitem(self, key, value):

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in getitem(self, key) 361 362 def getitem(self, key): --> 363 return type(self)(self.array, self._updated_key(key)) 364 365 def setitem(self, key, value):

~/.pyvirtualenvs/py3/lib/python3.5/site-packages/xarray/core/indexing.py in _updated_key(self, new_key) 336 raise NotImplementedError( 337 'Vectorized indexing for {} is not implemented. Load your ' --> 338 'data first with .load() or .compute().'.format(type(self))) 339 new_key = iter(expanded_indexer(new_key, self.ndim)) 340 key = []

NotImplementedError: Vectorized indexing for <class 'xarray.core.indexing.LazilyIndexedArray'> is not implemented. Load your data first with .load() or .compute().

```

Here is the file: cesm_data.nc.zip

Expected Output

This used to work in v0.9

Output of xr.show_versions()

INSTALLED VERSIONS ------------------ commit: None python: 3.5.2.final.0 python-bits: 64 OS: Linux OS-release: 4.10.0-38-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.UTF-8 LOCALE: en_US.UTF-8 xarray: 0.10.0rc1 pandas: 0.21.0 numpy: 1.13.3 scipy: 1.0.0 netCDF4: 1.3.0 h5netcdf: None Nio: None bottleneck: 1.2.1 cyordereddict: None dask: 0.15.4 matplotlib: 2.1.0 cartopy: 0.15.1 seaborn: 0.8.1 setuptools: 36.6.0 pip: 9.0.1 conda: None pytest: 3.2.3 IPython: 6.2.1 sphinx: 1.6.5
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1688/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
231811609 MDU6SXNzdWUyMzE4MTE2MDk= 1429 Orthogonal indexing and MemoryCachedArray fmaussion 10050469 closed 0     5 2017-05-27T16:20:18Z 2017-11-06T17:21:56Z 2017-11-06T17:21:56Z MEMBER      

While working on https://github.com/pydata/xarray/pull/1260 I came upon this which looks like a bug in caching:

```python
import numpy as np
import xarray as xr
from xarray.core import indexing

nx, ny = 8, 10
data = np.arange(nx * ny).reshape(ny, nx)
cached = indexing.MemoryCachedArray(data)

data = xr.DataArray(data=data, dims=('y', 'x'))
cached = xr.DataArray(data=cached, dims=('y', 'x'))

a = data.isel(x=[2, 4], y=[3, 5])
b = cached.isel(x=[2, 4], y=[3, 5])
```

The last line raises:

```

AssertionError Traceback (most recent call last) <ipython-input-13-45cd1493cf6b> in <module>() 11 12 a = data.isel(x=[2, 4], y=[3, 5]) ---> 13 b = cached.isel(x=[2, 4], y=[3, 5])

/home/mowglie/Documents/git/xarray/xarray/core/dataarray.py in isel(self, drop, indexers) 668 DataArray.sel 669 """ --> 670 ds = self._to_temp_dataset().isel(drop=drop, indexers) 671 return self._from_temp_dataset(ds) 672

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in isel(self, drop, indexers) 1141 for name, var in iteritems(self._variables): 1142 var_indexers = dict((k, v) for k, v in indexers if k in var.dims) -> 1143 new_var = var.isel(var_indexers) 1144 if not (drop and name in var_indexers): 1145 variables[name] = new_var

/home/mowglie/Documents/git/xarray/xarray/core/variable.py in isel(self, **indexers) 547 if dim in indexers: 548 key[i] = indexers[dim] --> 549 return self[tuple(key)] 550 551 def squeeze(self, dim=None):

/home/mowglie/Documents/git/xarray/xarray/core/variable.py in getitem(self, key) 380 # orthogonal indexing should ensure the dimensionality is consistent 381 if hasattr(values, 'ndim'): --> 382 assert values.ndim == len(dims), (values.ndim, len(dims)) 383 else: 384 assert len(dims) == 0, len(dims)

AssertionError: (1, 2) ```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1429/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
201428093 MDU6SXNzdWUyMDE0MjgwOTM= 1215 to_netcdf() fails to append to an existing file fmaussion 10050469 closed 0     14 2017-01-17T22:45:45Z 2017-10-25T05:09:10Z 2017-10-25T05:09:10Z MEMBER      

The following code used to work well in v0.8.2:

```python
import os
import xarray as xr

path = 'test.nc'
if os.path.exists(path):
    os.remove(path)

ds = xr.Dataset()
ds['dim'] = ('dim', [0, 1, 2])
ds['var1'] = ('dim', [10, 11, 12])
ds.to_netcdf(path)

ds = xr.Dataset()
ds['dim'] = ('dim', [0, 1, 2])
ds['var2'] = ('dim', [10, 11, 12])
ds.to_netcdf(path, 'a')
```

On master, it fails with:

```

RuntimeError Traceback (most recent call last) <ipython-input-1-fce5f5e876aa> in <module>() 14 ds['dim'] = ('dim', [0, 1, 2]) 15 ds['var2'] = ('dim', [10, 11, 12]) ---> 16 ds.to_netcdf(path, 'a')

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding) 927 from ..backends.api import to_netcdf 928 return to_netcdf(self, path, mode, format=format, group=group, --> 929 engine=engine, encoding=encoding) 930 931 def unicode(self):

/home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path, mode, format, group, engine, writer, encoding) 563 store = store_cls(path, mode, format, group, writer) 564 try: --> 565 dataset.dump_to_store(store, sync=sync, encoding=encoding) 566 if isinstance(path, BytesIO): 567 return path.getvalue()

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding) 873 variables, attrs = encoder(variables, attrs) 874 --> 875 store.store(variables, attrs, check_encoding) 876 if sync: 877 store.sync()

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set) 219 cf_variables, cf_attrs = cf_encoder(variables, attributes) 220 AbstractWritableDataStore.store(self, cf_variables, cf_attrs, --> 221 check_encoding_set) 222 223

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set) 194 def store(self, variables, attributes, check_encoding_set=frozenset()): 195 self.set_attributes(attributes) --> 196 self.set_variables(variables, check_encoding_set) 197 198 def set_attributes(self, attributes):

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in set_variables(self, variables, check_encoding_set) 204 name = _encode_variable_name(vn) 205 check = vn in check_encoding_set --> 206 target, source = self.prepare_variable(name, v, check) 207 self.writer.add(source, target) 208

/home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in prepare_variable(self, name, variable, check_encoding) 293 endian='native', 294 least_significant_digit=encoding.get('least_significant_digit'), --> 295 fill_value=fill_value) 296 nc4_var.set_auto_maskandscale(False) 297

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.createVariable (netCDF4/_netCDF4.c:18740)()

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.init (netCDF4/_netCDF4.c:30713)()

RuntimeError: NetCDF: String match to name in use ```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1215/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
264209937 MDU6SXNzdWUyNjQyMDk5Mzc= 1620 repr of class methods fmaussion 10050469 closed 0     5 2017-10-10T12:32:10Z 2017-10-12T09:02:55Z 2017-10-12T09:02:55Z MEMBER      

Some live news from the classroom. A student (who is learning python and xarray at the same time) wanted to compute the minimum of an array and forgot the parenthesis (quite a common mistake).

The printout in the notebook in that case is:

```python
ds.toa_sw_all_mon.min
```

```
<bound method ImplementsArrayReduce._reduce_method.<locals>.wrapped_func of <xarray.DataArray 'toa_sw_all_mon' (month: 12, lat: 180, lon: 360)>
[777600 values with dtype=float32]
Coordinates:
  * month    (month) float32 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0
  * lat      (lat) float32 -89.5 -88.5 -87.5 -86.5 -85.5 -84.5 -83.5 -82.5 ...
  * lon      (lon) float32 -179.5 -178.5 -177.5 -176.5 -175.5 -174.5 -173.5 ...
Attributes:
    long_name:      Top of The Atmosphere Shortwave Flux, Monthly Means, All-...
    standard_name:  TOA Shortwave Flux - All-Sky
    CF_name:        toa_outgoing_shortwave_flux
    IPCC_name:      none
    units:          W m-2
    valid_min:      0.00000
    valid_max:      600.000>
```

which, I had to agree, is hard to identify as being a function. It's a detail, but is it necessary to print the entire repr of the array/dataset in the function repr?
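A tiny sketch (my example) of the distinction the student stumbled over — without parentheses you get the bound method object, not the reduced value:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(0.5 * np.arange(4.0), dims='x')

m = da.min    # forgot the parentheses: a bound method, whose repr is shown above
v = da.min()  # the actual reduction: a 0-d DataArray
```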

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1620/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
228821088 MDU6SXNzdWUyMjg4MjEwODg= 1409 Importing h5py corrupts xarray's IO fmaussion 10050469 closed 0     3 2017-05-15T19:37:13Z 2017-05-21T09:31:49Z 2017-05-21T09:31:49Z MEMBER      

Not sure if this is an xarray issue, a netCDF4 or a h5py one, but I found that importing h5py is not a good idea if you want to write to netcdf4 afterwards:

```python
import h5py
import xarray as xr

ds = xr.Dataset({'x': [1, 2, 3]})
ds.to_netcdf('test.nc4')
```

Errors with:

```

RuntimeError Traceback (most recent call last) <ipython-input-7-924aff64174f> in <module>() 2 import xarray as xr 3 ds = xr.Dataset({'x': [1, 2, 3]}) ----> 4 ds.to_netcdf('test.nc4')

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims) 974 return to_netcdf(self, path, mode, format=format, group=group, 975 engine=engine, encoding=encoding, --> 976 unlimited_dims=unlimited_dims) 977 978 def unicode(self):

/home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims) 581 try: 582 dataset.dump_to_store(store, sync=sync, encoding=encoding, --> 583 unlimited_dims=unlimited_dims) 584 if path_or_file is None: 585 return target.getvalue()

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims) 913 914 store.store(variables, attrs, check_encoding, --> 915 unlimited_dims=unlimited_dims) 916 if sync: 917 store.sync()

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, args, kwargs) 244 cf_variables, cf_attrs = cf_encoder(variables, attributes) 245 AbstractWritableDataStore.store(self, cf_variables, cf_attrs, --> 246 args, **kwargs) 247 248

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, unlimited_dims) 213 self.set_attributes(attributes) 214 self.set_variables(variables, check_encoding_set, --> 215 unlimited_dims=unlimited_dims) 216 217 def set_attributes(self, attributes):

/home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in set_variables(self, args, kwargs) 286 def set_variables(self, args, kwargs): 287 with self.ensure_open(autoclose=False): --> 288 super(NetCDF4DataStore, self).set_variables(*args, kwargs) 289 290 def prepare_variable(self, name, variable, check_encoding=False,

/usr/lib/python3.5/contextlib.py in exit(self, type, value, traceback) 75 value = type() 76 try: ---> 77 self.gen.throw(type, value, traceback) 78 raise RuntimeError("generator didn't stop after throw()") 79 except StopIteration as exc:

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in ensure_open(self, autoclose) 282 self.close() 283 else: --> 284 yield 285 286 def assert_open(self):

/home/mowglie/Documents/git/xarray/xarray/backends/netCDF4_.py in set_variables(self, args, kwargs) 286 def set_variables(self, args, kwargs): 287 with self.ensure_open(autoclose=False): --> 288 super(NetCDF4DataStore, self).set_variables(*args, kwargs) 289 290 def prepare_variable(self, name, variable, check_encoding=False,

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in set_variables(self, variables, check_encoding_set, unlimited_dims) 226 target, source = self.prepare_variable( 227 name, v, check, unlimited_dims=unlimited_dims) --> 228 self.writer.add(source, target) 229 230 def set_necessary_dimensions(self, variable, unlimited_dims=None):

/home/mowglie/Documents/git/xarray/xarray/backends/common.py in add(self, source, target) 167 else: 168 try: --> 169 target[...] = source 170 except TypeError: 171 # workaround for GH: scipy/scipy#6880

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.setitem (netCDF4/_netCDF4.c:48315)()

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable._put (netCDF4/_netCDF4.c:49808)()

RuntimeError: NetCDF: HDF error ```

Note that using engine='scipy' or omitting to import h5py doesn't throw an error of course.

For the record:

- h5py: 2.7.0
- xarray: 0.9.5-5-gfd6e36e
- system: linux

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1409/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
211888882 MDU6SXNzdWUyMTE4ODg4ODI= 1295 Terminology for the various coordinates fmaussion 10050469 closed 0     8 2017-03-04T16:12:53Z 2017-03-15T16:28:12Z 2017-03-15T16:28:12Z MEMBER      

Picking up a thread about the repr (https://github.com/pydata/xarray/issues/1199#issuecomment-272824929), I think it would be good to give a name to the two different types of coordinates in xarray.

Currently the doc says:

One dimensional coordinates with a name equal to their sole dimension (marked by * when printing a dataset or data array) take on a special meaning in xarray. They are used for label based indexing and alignment, like the index found on a pandas DataFrame or Series. Indeed, these “dimension” coordinates use a pandas.Index internally to store their values.

Other than for indexing, xarray does not make any direct use of the values associated with coordinates. Coordinates with names not matching a dimension are not used for alignment or indexing, nor are they required to match when doing arithmetic (see Coordinates).

The use of quotation marks in “dimension” coordinates makes the term imprecise. Should we simply call the former dimension coordinates and the latter optional coordinates?

This would also help to uniformize error reporting (e.g. https://github.com/pydata/xarray/pull/1291#discussion_r104261803)
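To make the distinction concrete, a sketch (my own example): `x` below is one of the former kind (backed by a `pandas.Index` and usable with `.sel()`), while `lon` is one of the latter:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.zeros((2, 3)), dims=('y', 'x'),
    coords={'x': [10, 20, 30],               # name matches its sole dimension
            'lon': ('x', [0.1, 0.2, 0.3])},  # name does not match a dimension
)

# only the dimension coordinate gets a pandas.Index
assert 'x' in da.indexes
assert 'lon' not in da.indexes
```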

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1295/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
209082646 MDU6SXNzdWUyMDkwODI2NDY= 1280 Current doc builds are broken fmaussion 10050469 closed 0 fmaussion 10050469   2 2017-02-21T09:12:52Z 2017-03-08T10:42:21Z 2017-03-08T10:42:21Z MEMBER      

This is a RTD problem so out of our control, but I'll leave this issue opened until it is resolved.

See RTD issue: https://github.com/rtfd/readthedocs.org/issues/2651

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1280/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
200125945 MDU6SXNzdWUyMDAxMjU5NDU= 1199 Document the new __repr__ fmaussion 10050469 closed 0   0.9.0 2244472 23 2017-01-11T15:37:37Z 2017-01-30T17:41:34Z 2017-01-30T17:41:34Z MEMBER      

Sorry I missed that one when it was decided upon in https://github.com/pydata/xarray/pull/1017, but I think the changes in repr should be documented somewhere (at the minimum in the "Breaking Changes" section of what's new).

I just updated Salem for it to work well with xarray 0.9.0. The changes I had to make where quite small (that's a good thing), but it took me a bit of time to understand what was going on.

What I found confusing is following:

```python
In [1]: import xarray as xr

In [2]: ds = xr.DataArray([1, 2, 3]).to_dataset(name='var')

In [3]: ds
Out[3]:
<xarray.Dataset>
Dimensions:  (dim_0: 3)
Coordinates:
  o dim_0    (dim_0) -
Data variables:
    var      (dim_0) int64 1 2 3

In [4]: 'dim_0' in ds.coords
Out[4]: False
```

`dim_0` is listed as a coordinate, but `'dim_0' in ds.coords` is `False`. I think it should remain like this, but maybe we should document somewhere what the "o" and "*" mean?

(possibly here)
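A small sketch (mine) of the behaviour under discussion — the default dimension shows up in `dims` but is not an actual coordinate:

```python
import xarray as xr

ds = xr.DataArray([1, 2, 3]).to_dataset(name='var')

assert 'dim_0' in ds.dims        # it is a dimension ...
assert 'dim_0' not in ds.coords  # ... but not a coordinate
```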

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1199/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
201021909 MDU6SXNzdWUyMDEwMjE5MDk= 1211 __repr__ of 2D coordinates fmaussion 10050469 closed 0     2 2017-01-16T13:41:30Z 2017-01-17T11:37:12Z 2017-01-17T11:37:12Z MEMBER      

This is a minor issue (sorry to be so picky about the repr ;) )

Small 2D coordinates are represented in a weird way:

```python
In [1]: import numpy as np; import xarray as xr

In [2]: a = np.array([[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]])

In [3]: da = xr.DataArray(a, dims=['y', 'x'], coords={'xy':(['y', 'x'], a)})

In [4]: da
Out[4]:
<xarray.DataArray (y: 2, x: 3)>
array([[ 1.1,  2.2,  3.3],
       [ 4.4,  5.5,  6.6]])
Coordinates:
    xy       (y, x) float64 1.1 2.2 3.3 4.4 5.5 6.6
  o y        (y) -
  o x        (x) -
```

This line here:

```
    xy       (y, x) float64 1.1 2.2 3.3 4.4 5.5 6.6
```

is a bit confusing, as it flattens the coordinate values. It's not a big deal though, just something to keep in mind maybe.
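A quick sketch (mine) verifying that only the repr flattens the values — the coordinate object itself keeps its 2D shape:

```python
import numpy as np
import xarray as xr

a = np.array([[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]])
da = xr.DataArray(a, dims=['y', 'x'], coords={'xy': (['y', 'x'], a)})

# the 2D coordinate is stored intact; flattening happens only in the repr
assert da['xy'].shape == (2, 3)
```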

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1211/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
188985482 MDU6SXNzdWUxODg5ODU0ODI= 1114 Converting rasm file to netCDF3 using xarray fmaussion 10050469 closed 0     6 2016-11-13T18:25:28Z 2016-12-27T22:24:28Z 2016-12-27T22:24:28Z MEMBER      

This would help new users like https://github.com/pydata/xarray/issues/1113 and simplify the RTD build process (https://github.com/pydata/xarray/issues/1106).

The problem is that it is not as trivial as expected. On the latest master:

```python
import xarray as xr

ds = xr.tutorial.load_dataset('rasm')
ds.to_netcdf('rasm.nc', format='NETCDF3_CLASSIC', engine='scipy')
```

Throws an error:

```python

ValueError Traceback (most recent call last) /home/mowglie/Documents/git/xarray/xarray/backends/api.py in to_netcdf(dataset, path, mode, format, group, engine, writer, encoding) 516 try: --> 517 dataset.dump_to_store(store, sync=sync, encoding=encoding) 518 if isinstance(path, BytesIO):

/home/mowglie/Documents/git/xarray/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding) 754 if sync: --> 755 store.sync() 756

/home/mowglie/Documents/git/xarray/xarray/backends/scipy_.py in sync(self) 149 super(ScipyDataStore, self).sync() --> 150 self.ds.flush() 151

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in flush(self) 388 if hasattr(self, 'mode') and self.mode in 'wa': --> 389 self._write() 390 sync = flush

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write(self) 400 self._write_gatt_array() --> 401 self._write_var_array() 402

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write_var_array(self) 448 for name in variables: --> 449 self._write_var_metadata(name) 450 # Now that we have the metadata, we know the vsize of

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/scipy/io/netcdf.py in _write_var_metadata(self, name) 466 for dimname in var.dimensions: --> 467 dimid = self._dims.index(dimname) 468 self._pack_int(dimid)

ValueError: '2' is not in list ```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1114/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
193226503 MDU6SXNzdWUxOTMyMjY1MDM= 1150 "ncdump -h" like repr? fmaussion 10050469 closed 0     4 2016-12-02T21:51:36Z 2016-12-23T17:36:54Z 2016-12-23T17:36:54Z MEMBER      

Sometimes it could be useful to have a view of all variables attributes at a glance. For example, this is the repr for ERA-Interim energy fluxes data:

```
(...)
Data variables:
    slhf    (month, latitude, longitude) float64 0.02852 0.02852 0.02852 ...
    tsr     (month, latitude, longitude) float64 -0.0001912 -0.0001912 ...
    strd    (month, latitude, longitude) float64 166.6 166.6 166.6 166.6 ...
    strc    (month, latitude, longitude) float64 -66.23 -66.23 -66.23 ...
    tisr    (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ...
    ssrd    (month, latitude, longitude) float64 -0.0003951 -0.0003951 ...
    ssrc    (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ...
    str     (month, latitude, longitude) float64 -40.65 -40.65 -40.65 ...
    ttr     (month, latitude, longitude) float64 -171.5 -171.5 -171.5 ...
    tsrc    (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ...
    sshf    (month, latitude, longitude) float64 10.46 10.46 10.46 10.46 ...
    ssr     (month, latitude, longitude) float64 0.0 0.0 0.0 0.0 0.0 0.0 ...
    ttrc    (month, latitude, longitude) float64 -174.9 -174.9 -174.9 ...
(...)
```

This is what my students will see when they explore the dataset for the first time. It could be nice to have a utility function (e.g. `dumph` or something) which would have a style closer to `ncdump -h`:

```
(...)
double ttr(month, latitude, longitude) ;
    ttr:least_significant_digit = 2L ;
    ttr:units = "J m**-2" ;
    ttr:long_name = "Top net thermal radiation" ;
    ttr:standard_name = "toa_outgoing_longwave_flux" ;
double tsrc(month, latitude, longitude) ;
    tsrc:least_significant_digit = 2L ;
    tsrc:units = "J m**-2" ;
    tsrc:long_name = "Top net solar radiation, clear sky" ;
double sshf(month, latitude, longitude) ;
    sshf:least_significant_digit = 2L ;
    sshf:units = "J m**-2" ;
    sshf:long_name = "Surface sensible heat flux" ;
    sshf:standard_name = "surface_upward_sensible_heat_flux" ;
double ssr(month, latitude, longitude) ;
    ssr:least_significant_digit = 2L ;
    ssr:units = "J m**-2" ;
    ssr:long_name = "Surface net solar radiation" ;
    ssr:standard_name = "surface_net_downward_shortwave_flux" ;
double ttrc(month, latitude, longitude) ;
    ttrc:least_significant_digit = 2L ;
    ttrc:units = "J m**-2" ;
    ttrc:long_name = "Top net thermal radiation, clear sky" ;
(...)
```

Or is there something like this already?
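A note for later readers: current xarray versions provide `Dataset.info()`, which prints exactly this kind of `ncdump -h`-style header. A sketch with a made-up toy dataset:

```python
import io

import numpy as np
import xarray as xr

ds = xr.Dataset({'ttr': (('month',), np.zeros(12))})
ds['ttr'].attrs['units'] = 'J m**-2'

buf = io.StringIO()
ds.info(buf=buf)  # writes an ncdump-style header instead of the default repr
print(buf.getvalue())
```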

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1150/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
191829594 MDU6SXNzdWUxOTE4Mjk1OTQ= 1135 DOCS: broken "What's new" fmaussion 10050469 closed 0     3 2016-11-26T22:14:17Z 2016-11-26T23:07:36Z 2016-11-26T23:07:36Z MEMBER      

http://xarray.pydata.org/en/latest/whats-new.html See the examples at the bottom.

These are all old examples relying on the "xray" package. We can either remove these examples (my suggestion) or update them to use xarray.

As a general rule, I think that the what's new page shouldn't contain any code, at least not code that has to be run by RTD

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1135/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
188565022 MDU6SXNzdWUxODg1NjUwMjI= 1106 Getting netCDF4 to work on RTD fmaussion 10050469 closed 0     20 2016-11-10T17:07:35Z 2016-11-26T21:48:40Z 2016-11-26T21:48:40Z MEMBER      

This is to ping @ocefpaf on whether you have an idea on what's going on with netCDF4 on Read The Docs. See the import error here.

This is our conda config file: https://github.com/pydata/xarray/blob/master/doc/environment.yml

I've found related discussions here or here. Note that I never saw this when building salem's doc, which installs many more packages from conda-forge (see conf file).

Thanks a lot for your help! And no hurry.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1106/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
188831497 MDU6SXNzdWUxODg4MzE0OTc= 1111 Unable to decode time axis on rasm file fmaussion 10050469 closed 0     2 2016-11-11T19:23:53Z 2016-11-13T18:04:00Z 2016-11-13T18:04:00Z MEMBER      

```python
import xarray as xr
import netCDF4

print(xr.__version__)       # 0.8.2-50-g57facab
print(netCDF4.__version__)  # 1.2.4

ds = xr.tutorial.load_dataset('rasm')
```

```
/home/mowglie/Documents/git/xarray/xarray/conventions.py:389: RuntimeWarning: Unable to decode time axis into full
numpy.datetime64 objects, continuing using dummy netCDF4.datetime objects instead, reason: dates out of range
  result = decode_cf_datetime(example_value, units, calendar)
```

I'll have a closer look.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1111/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
187200802 MDU6SXNzdWUxODcyMDA4MDI= 1078 Fascinating bug in contourf fmaussion 10050469 closed 0     3 2016-11-03T21:54:19Z 2016-11-03T22:15:46Z 2016-11-03T22:10:10Z MEMBER      

Can someone reproduce this or is it just me?

```python
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr
import cartopy.crs as ccrs

nlats, nlons = (241, 480)
lats = np.linspace(90, -90, nlats, dtype=np.float32)
lons = np.linspace(-180, 180 - 0.75, nlons, dtype=np.float32)
l1, l2 = np.meshgrid(lons, lats)
data = xr.DataArray(l1 + l2, [('latitude', lats), ('longitude', lons)])

f = plt.figure()

ax1 = plt.subplot(2, 1, 1, projection=ccrs.Robinson())
data.plot.contourf(ax=ax1, transform=ccrs.PlateCarree())
ax1.coastlines(color='grey')
ax1.gridlines()

data += 180  # this is the line causing the problem

ax2 = plt.subplot(2, 1, 2, projection=ccrs.Robinson())
data.plot.contourf(ax=ax2, transform=ccrs.PlateCarree())
ax2.coastlines(color='grey')
ax2.gridlines()

plt.show()
```

Gives:

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1078/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
168876028 MDU6SXNzdWUxNjg4NzYwMjg= 933 rc 0.8: register_dataset_accessor silently ignores AttributeError fmaussion 10050469 closed 0     3 2016-08-02T12:44:46Z 2016-08-03T06:51:22Z 2016-08-03T06:51:22Z MEMBER      

If an AttributeError is raised at construction time by the accessor, it will be ignored silently:

``` python
import numpy as np
import xarray as xr

@xr.register_dataset_accessor('stupid_accessor')
class GeoAccessor(object):
    def __init__(self, xarray_obj):
        self._obj = xarray_obj
        raise AttributeError('Ups')

    def plot(self):
        """Plot data on a map."""
        return 'plotting!'

ds = xr.Dataset({'longitude': np.linspace(0, 10), 'latitude': np.linspace(0, 20)})
ds.stupid_accessor.plot()
```

Will raise:

``` AttributeError Traceback (most recent call last) <ipython-input-1-b47ffc33cff2> in <module>() 12 13 ds = xr.Dataset({'longitude': np.linspace(0, 10), 'latitude': np.linspace(0, 20)}) ---> 14 ds.stupid_accessor.plot()

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/xarray/core/common.py in getattr(self, name) 192 return source[name] 193 raise AttributeError("%r object has no attribute %r" % --> 194 (type(self).name, name)) 195 196 def setattr(self, name, value):

AttributeError: 'Dataset' object has no attribute 'stupid_accessor' ```

The problem with this error message is that it doesn't tell you that the problem is in your code.

Interestingly, raising other exceptions like RuntimeError will result in the more informative traceback:

```
RuntimeError                              Traceback (most recent call last)
<ipython-input-1-b575d0257332> in <module>()
     12
     13 ds = xr.Dataset({'longitude': np.linspace(0, 10), 'latitude': np.linspace(0, 20)})
---> 14 ds.stupid_accessor.plot()

/home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/xarray/core/extensions.py in __get__(self, obj, cls)
     17             # we're accessing the attribute of the class, i.e., Dataset.geo
     18             return self._accessor
---> 19         accessor_obj = self._accessor(obj)
     20         # Replace the property with the accessor object. Inspired by:
     21         # http://www.pydanny.com/cached-property.html

<ipython-input-1-b575d0257332> in __init__(self, xarray_obj)
      5     def __init__(self, xarray_obj):
      6         self._obj = xarray_obj
----> 7         raise RuntimeError('Ups')
      8
      9     def plot(self):

RuntimeError: Ups
```
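The masking can be reproduced without xarray at all: when a property raises AttributeError, Python falls back to `__getattr__`, which then reports the misleading "no attribute" message. A minimal stand-alone sketch (the class below is my own model of the mechanism, not xarray's code):

```python
class Dataset(object):
    # Minimal model of the masking: the accessor is a property that fails
    # with AttributeError at construction time.
    @property
    def stupid_accessor(self):
        raise AttributeError('Ups')  # the accessor's real failure

    def __getattr__(self, name):
        # Fallback lookup, as in xarray's common.py: it fires whenever normal
        # attribute access raises AttributeError -- including the one above
        raise AttributeError("%r object has no attribute %r"
                             % (type(self).__name__, name))


try:
    Dataset().stupid_accessor
except AttributeError as err:
    print(err)  # → 'Dataset' object has no attribute 'stupid_accessor'
```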

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/933/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
168865176 MDU6SXNzdWUxNjg4NjUxNzY= 932 Doc: link to Internals in "what's new" fmaussion 10050469 closed 0   0.8.0 1816292 1 2016-08-02T11:43:34Z 2016-08-02T17:38:28Z 2016-08-02T17:38:22Z MEMBER      

Very minor issue, but the link to Internals in the latest what's new redirects to pandas', not xarray's.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/932/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
117002929 MDU6SXNzdWUxMTcwMDI5Mjk= 657 Plotting on map projection much slower on v0.6.1 than 0.6.0 fmaussion 10050469 closed 0     19 2015-11-15T16:39:56Z 2016-06-15T21:10:38Z 2016-06-15T18:17:51Z MEMBER      

The following code snippet produces an average of ERA-Interim temperature on a map:

``` python
import matplotlib.pyplot as plt
import xray
import cartopy.crs as ccrs
import time

netcdf = xray.open_dataset('ERA-Int-Monthly-2mTemp.nc')
t2_avg = netcdf.t2m.mean(dim='time')

start_time = time.time()
ax = plt.axes(projection=ccrs.Robinson())
if xray.__version__ == '0.6.0':
    t2_avg.plot(ax=ax, origin='upper', aspect='equal',
                transform=ccrs.PlateCarree())
else:
    t2_avg.plot(ax=ax, transform=ccrs.PlateCarree())
ax.coastlines()
plt.savefig('t_xray.png')
print("xray V{}: {:.2f} s".format(xray.__version__, time.time() - start_time))
```

I've been very careful to check that my environments are exactly the same (mpl 1.4.3, numpy 1.10.1, cartopy 0.13.0).

See the output for V0.6.0 and 0.6.1 (output from the latest master is similar to 0.6.1):

0.6.0:

``` python
python test_xray.py
/home/mowglie/.bin/conda/envs/climate/lib/python3.4/site-packages/matplotlib/collections.py:590: FutureWarning: elementwise comparison failed; returning scalar instead, but in the future will perform elementwise comparison
  if self._edgecolors == str('face'):
xray V0.6.0: 3.21 s
```

0.6.1:

``` python
python test_xray.py
/home/mowglie/.bin/conda/envs/upclim/lib/python3.4/site-packages/numpy/lib/shape_base.py:431: FutureWarning: in the future np.array_split will retain the shape of arrays with a zero size, instead of replacing them by `array([])`, which always has a shape of (0,).
  FutureWarning)
/home/mowglie/.bin/conda/envs/upclim/lib/python3.4/site-packages/matplotlib/collections.py:590: FutureWarning: elementwise comparison failed; returning scalar instead, but in the future will perform elementwise comparison
  if self._edgecolors == str('face'):
xray V0.6.1: 28.52 s
```

The first warning seems related to recent numpy. Note that a second warning appeared with xray V0.6.1.

It's worth mentioning that the bottleneck is clearly in the rendering (plt.savefig('t_xray.png')); removing this line makes xray V0.6.1 faster than xray V0.6.0.
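To pin the regression on a single stage, the script could time each step separately. A minimal per-step timer sketch (the `timed` helper and labels are my own, not part of the script above):

```python
import time
from contextlib import contextmanager


@contextmanager
def timed(label):
    # Print how long the enclosed block took, to locate the slow stage
    start = time.time()
    yield
    print('{}: {:.2f} s'.format(label, time.time() - start))


# Hypothetical usage with the script above would be, e.g.:
#   with timed('plot'): t2_avg.plot(ax=ax, transform=ccrs.PlateCarree())
#   with timed('savefig'): plt.savefig('t_xray.png')
with timed('demo'):
    total = sum(range(10 ** 6))
```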

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/657/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
114576415 MDU6SXNzdWUxMTQ1NzY0MTU= 642 Add a keyword to prevent divergent plot params fmaussion 10050469 closed 0     16 2015-11-02T11:27:44Z 2015-11-16T08:41:13Z 2015-11-16T08:41:13Z MEMBER      

I kinda agree with the default behaviour which is to consider the data as divergent when it contains "zero", but there should be an easy way to avoid this. Maybe I overlooked this but the utils._determine_cmap_params is quite intrusive:

``` python
# Simple heuristics for whether these data should have a divergent map
divergent = ((vmin < 0) and (vmax > 0)) or center is not None

# Now set center to 0 so math below makes sense
if center is None:
    center = 0

# A divergent map should be symmetric around the center value
if divergent:
    vlim = max(abs(vmin - center), abs(vmax - center))
    vmin, vmax = -vlim, vlim
```

This overrides the user's vmin/vmax without any opportunity to opt out. What about a divergent=False keyword or something similar?
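A sketch of what such a keyword could look like (the function below is illustrative, not xarray's actual `_determine_cmap_params`): the heuristic only fires when `divergent` is left at its default of `None`.

```python
def determine_cmap_params(vmin, vmax, center=None, divergent=None):
    # Illustrative opt-out: only apply the heuristic when not told otherwise
    if divergent is None:
        divergent = ((vmin < 0) and (vmax > 0)) or center is not None

    if center is None:
        center = 0

    if divergent:
        # Symmetric limits around the center value, as in the current code
        vlim = max(abs(vmin - center), abs(vmax - center))
        vmin, vmax = -vlim, vlim
    return vmin, vmax


print(determine_cmap_params(-1, 10))                   # → (-10, 10)
print(determine_cmap_params(-1, 10, divergent=False))  # → (-1, 10)
```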

I'd be happy to submit a PR if found useful.

Thanks,

Fabien

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/642/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
115556712 MDU6SXNzdWUxMTU1NTY3MTI= 647 Plot example broken in 0.6.1 fmaussion 10050469 closed 0     11 2015-11-06T18:02:28Z 2015-11-12T17:39:54Z 2015-11-12T09:05:40Z MEMBER      

The plotting-with-cartopy example from the docs: http://xray.readthedocs.org/en/stable/plotting.html

is broken with the latest conda install (xray 0.6.1, linux, python 3.4, mpl 1.4.3). It used to work fine in 0.6.0:

``` python
(test_xray)mowglie@flappi ~ $ python test_xray.py
Traceback (most recent call last):
  File "test_xray.py", line 17, in <module>
    transform=ccrs.PlateCarree())
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/xray/plot/plot.py", line 246, in __call__
    return plot(self._da, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/xray/plot/plot.py", line 124, in plot
    return plotfunc(darray, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/xray/plot/plot.py", line 417, in newplotfunc
    **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/xray/plot/plot.py", line 547, in pcolormesh
    primitive = ax.pcolormesh(x, y, z, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/cartopy/mpl/geoaxes.py", line 1134, in pcolormesh
    result = self._pcolormesh_patched(*args, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/cartopy/mpl/geoaxes.py", line 1192, in _pcolormesh_patched
    antialiased=antialiased, shading=shading, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/matplotlib/collections.py", line 1683, in __init__
    Collection.__init__(self, **kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/matplotlib/collections.py", line 135, in __init__
    self.update(kwargs)
  File "/home/mowglie/.bin/conda/envs/test_xray/lib/python3.4/site-packages/matplotlib/artist.py", line 757, in update
    raise AttributeError('Unknown property %s' % k)
AttributeError: Unknown property origin
```
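One generic way to guard against such unknown-property failures (my own sketch, not the fix that went into xray) is to filter keyword arguments against the target function's signature before forwarding them:

```python
import inspect


def filter_kwargs(func, kwargs):
    # Keep only the keyword arguments that `func` actually accepts
    params = inspect.signature(func).parameters
    if any(p.kind == inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)  # func takes **kwargs: pass everything through
    return {k: v for k, v in kwargs.items() if k in params}


def pcolormesh_like(x, y, cmap=None, vmin=None, vmax=None):
    """Hypothetical stand-in for an axes method that rejects `origin`."""


filtered = filter_kwargs(pcolormesh_like, {'cmap': 'viridis', 'origin': 'upper'})
print(filtered)  # → {'cmap': 'viridis'}
```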

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/647/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · Queries took 37.111ms · About: xarray-datasette