
issue_comments


207 rows where user = 4295853 sorted by updated_at descending


id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
444599941 https://github.com/pydata/xarray/issues/2592#issuecomment-444599941 https://api.github.com/repos/pydata/xarray/issues/2592 MDEyOklzc3VlQ29tbWVudDQ0NDU5OTk0MQ== pwolfram 4295853 2018-12-05T18:54:10Z 2018-12-05T18:54:10Z CONTRIBUTOR

Thanks @jhamman! @xylar, I'm thinking file_cache_maxsize=1200 would be a good default (100 years of monthly files) that could be modified via the config files we use.
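For reference, a minimal sketch of how that default could be applied, assuming xarray ≥ 0.11 where `xr.set_options` accepts a `file_cache_maxsize` keyword (the option that superseded `autoclose`):

```python
import xarray as xr

# Raise the global file-handle cache limit before opening many files;
# 1200 covers 100 years of monthly files, the default suggested above.
xr.set_options(file_cache_maxsize=1200)
```

Called plainly it changes the global default; it can also be used as a context manager to scope the change to a block.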

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Deprecated autoclose option 387892184
444597829 https://github.com/pydata/xarray/issues/2592#issuecomment-444597829 https://api.github.com/repos/pydata/xarray/issues/2592 MDEyOklzc3VlQ29tbWVudDQ0NDU5NzgyOQ== pwolfram 4295853 2018-12-05T18:47:55Z 2018-12-05T18:47:55Z CONTRIBUTOR

@jhamman, LRU is much better. The API change on our side is to replace autoclose=True with file_cache_maxsize=A_REASONABLE_BIG_NUMBER, correct?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Deprecated autoclose option 387892184
401405688 https://github.com/pydata/xarray/issues/470#issuecomment-401405688 https://api.github.com/repos/pydata/xarray/issues/470 MDEyOklzc3VlQ29tbWVudDQwMTQwNTY4OA== pwolfram 4295853 2018-06-29T16:27:02Z 2018-06-29T16:27:02Z CONTRIBUTOR

I agree this could be helpful... is there any interest in reviving this stale issue?

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  add scatter plot method to dataset 94787306
371676376 https://github.com/pydata/xarray/pull/924#issuecomment-371676376 https://api.github.com/repos/pydata/xarray/issues/924 MDEyOklzc3VlQ29tbWVudDM3MTY3NjM3Ng== pwolfram 4295853 2018-03-09T00:51:30Z 2018-03-09T00:51:30Z CONTRIBUTOR

Thanks @shoyer, I find this feature extremely useful and keep running into use cases for it. Given the changes to xarray, it sounds like the prudent course of action is as you outline. Thanks again for the quick reply!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  WIP: progress toward making groupby work with multiple arguments 168272291
371529015 https://github.com/pydata/xarray/pull/924#issuecomment-371529015 https://api.github.com/repos/pydata/xarray/issues/924 MDEyOklzc3VlQ29tbWVudDM3MTUyOTAxNQ== pwolfram 4295853 2018-03-08T15:51:38Z 2018-03-08T15:51:38Z CONTRIBUTOR

@shoyer, it looks like your list above is the place to start from your branch, correct?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  WIP: progress toward making groupby work with multiple arguments 168272291
371519787 https://github.com/pydata/xarray/pull/924#issuecomment-371519787 https://api.github.com/repos/pydata/xarray/issues/924 MDEyOklzc3VlQ29tbWVudDM3MTUxOTc4Nw== pwolfram 4295853 2018-03-08T15:23:25Z 2018-03-08T15:23:25Z CONTRIBUTOR

Just to refresh here-- what needs to be done to finish this off?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  WIP: progress toward making groupby work with multiple arguments 168272291
308129900 https://github.com/pydata/xarray/issues/1450#issuecomment-308129900 https://api.github.com/repos/pydata/xarray/issues/1450 MDEyOklzc3VlQ29tbWVudDMwODEyOTkwMA== pwolfram 4295853 2017-06-13T14:14:27Z 2017-06-13T14:14:27Z CONTRIBUTOR

Thanks @fmaussion and @shoyer, sorry about the duplicated issue...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Should an `apply` method exist for `DataArray` similar to the definition for `Dataset`? 235278888
298735223 https://github.com/pydata/xarray/issues/1394#issuecomment-298735223 https://api.github.com/repos/pydata/xarray/issues/1394 MDEyOklzc3VlQ29tbWVudDI5ODczNTIyMw== pwolfram 4295853 2017-05-02T19:24:07Z 2017-05-02T19:24:07Z CONTRIBUTOR

Note, we don't use decode_cf=False. Does it crash without this specification, i.e., using the default?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  autoclose with distributed doesn't seem to work 225734529
298735070 https://github.com/pydata/xarray/issues/1394#issuecomment-298735070 https://api.github.com/repos/pydata/xarray/issues/1394 MDEyOklzc3VlQ29tbWVudDI5ODczNTA3MA== pwolfram 4295853 2017-05-02T19:23:30Z 2017-05-02T19:23:30Z CONTRIBUTOR

@rabernat, I would say that this is a bug. Is this with the scipy backend or netCDF4? Presumably if you have this problem, we could run into it too. For the record, we are using netCDF4.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  autoclose with distributed doesn't seem to work 225734529
293970117 https://github.com/pydata/xarray/issues/1350#issuecomment-293970117 https://api.github.com/repos/pydata/xarray/issues/1350 MDEyOklzc3VlQ29tbWVudDI5Mzk3MDExNw== pwolfram 4295853 2017-04-13T17:37:02Z 2017-04-13T17:37:02Z CONTRIBUTOR

Both cases are fixed by #1361.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  where(..., drop=True) error 219043002
293970014 https://github.com/pydata/xarray/pull/1361#issuecomment-293970014 https://api.github.com/repos/pydata/xarray/issues/1361 MDEyOklzc3VlQ29tbWVudDI5Mzk3MDAxNA== pwolfram 4295853 2017-04-13T17:36:35Z 2017-04-13T17:36:35Z CONTRIBUTOR

@shoyer, thanks for the fix here-- sorry this fell through the cracks.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fix .where(drop=True) when arguments do not have indexes 220368276
293350309 https://github.com/pydata/xarray/pull/1355#issuecomment-293350309 https://api.github.com/repos/pydata/xarray/issues/1355 MDEyOklzc3VlQ29tbWVudDI5MzM1MDMwOQ== pwolfram 4295853 2017-04-11T18:08:38Z 2017-04-11T18:08:38Z CONTRIBUTOR

Thanks @shoyer and @jhamman! I've used this new citation for my papers in review. Awesome! Congratulations!!!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  update docs to reflect recently published paper on xarray 219938218
293148127 https://github.com/pydata/xarray/pull/1367#issuecomment-293148127 https://api.github.com/repos/pydata/xarray/issues/1367 MDEyOklzc3VlQ29tbWVudDI5MzE0ODEyNw== pwolfram 4295853 2017-04-11T04:41:54Z 2017-04-11T04:41:54Z CONTRIBUTOR

@shoyer, thanks for fixing this bug that I missed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fix open_dataarray does not pass properly its parameters to open_dataset 220533673
292646849 https://github.com/pydata/xarray/issues/422#issuecomment-292646849 https://api.github.com/repos/pydata/xarray/issues/422 MDEyOklzc3VlQ29tbWVudDI5MjY0Njg0OQ== pwolfram 4295853 2017-04-07T20:43:48Z 2017-04-07T20:43:48Z CONTRIBUTOR

@mathause can you please comment on the status of this issue? Is there an associated PR somewhere? Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  add average function 84127296
291258269 https://github.com/pydata/xarray/issues/1350#issuecomment-291258269 https://api.github.com/repos/pydata/xarray/issues/1350 MDEyOklzc3VlQ29tbWVudDI5MTI1ODI2OQ== pwolfram 4295853 2017-04-03T20:07:13Z 2017-04-03T20:07:13Z CONTRIBUTOR

As it turns out, this was found while hunting a more complicated bug, but I couldn't easily reproduce it via DataArrays built from scratch. My guess is that solving this will most likely solve that issue as well:

```python
ds = xr.open_dataset('mpaso.hist.0100-06-01_00000_potDensityOnly.nc')
ds['latCell'] = ds.latCell*180.0/np.pi
ds.set_coords('latCell', inplace=True)

potDen = ds.potentialDensity.where(ds.potentialDensity != 0, drop=True)
```

yielding

```python
Traceback (most recent call last):
  File "./plot_potential_density.py", line 52, in <module>
    main()
  File "./plot_potential_density.py", line 16, in main
    potDen = ds.potentialDensity.where(ds.potentialDensity != 0, drop=True)
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/common.py", line 627, in where
    outobj = self.sel(**indexers)
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/dataarray.py", line 672, in sel
    result = self.isel(drop=drop, **pos_indexers)
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/dataarray.py", line 657, in isel
    ds = self._to_temp_dataset().isel(drop=drop, **indexers)
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/dataset.py", line 1119, in isel
    new_var = var.isel(**var_indexers)
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/variable.py", line 548, in isel
    return self[tuple(key)]
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/variable.py", line 378, in __getitem__
    values = self._indexable_data[key]
  File "/Users/pwolfram/anaconda/lib/python2.7/site-packages/xarray/core/indexing.py", line 423, in __getitem__
    return type(self)(self.array[key])
IndexError: shape mismatch: indexing arrays could not be broadcast together with shapes (1,) (235446,) (100,)
```

It is possible that ds.set_coords('latCell', inplace=True) is part of the problem for this bug, so it may actually be a few problems, not just one.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  where(..., drop=True) error 219043002
291020601 https://github.com/pydata/xarray/pull/1342#issuecomment-291020601 https://api.github.com/repos/pydata/xarray/issues/1342 MDEyOklzc3VlQ29tbWVudDI5MTAyMDYwMQ== pwolfram 4295853 2017-04-02T22:45:01Z 2017-04-02T22:45:01Z CONTRIBUTOR

Thanks @shoyer!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Ensures drop=True case works with empty mask 218291642
291018928 https://github.com/pydata/xarray/pull/1336#issuecomment-291018928 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI5MTAxODkyOA== pwolfram 4295853 2017-04-02T22:12:28Z 2017-04-02T22:12:28Z CONTRIBUTOR

@shoyer, all checks pass and this is ready for a review / merge when you have time.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
290988338 https://github.com/pydata/xarray/pull/1336#issuecomment-290988338 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI5MDk4ODMzOA== pwolfram 4295853 2017-04-02T14:03:03Z 2017-04-02T14:03:03Z CONTRIBUTOR

@shoyer, changes have been made as requested. Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
290987246 https://github.com/pydata/xarray/pull/1342#issuecomment-290987246 https://api.github.com/repos/pydata/xarray/issues/1342 MDEyOklzc3VlQ29tbWVudDI5MDk4NzI0Ng== pwolfram 4295853 2017-04-02T13:43:23Z 2017-04-02T13:43:23Z CONTRIBUTOR

@shoyer, the bug fix note has now been added and commits merged! Thank you!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Ensures drop=True case works with empty mask 218291642
290601925 https://github.com/pydata/xarray/pull/1038#issuecomment-290601925 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI5MDYwMTkyNQ== pwolfram 4295853 2017-03-31T02:53:30Z 2017-03-31T02:53:30Z CONTRIBUTOR

@shoyer, tests should be restarted following merge of #1336 and this PR should be ready to merge.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
290600174 https://github.com/pydata/xarray/pull/1342#issuecomment-290600174 https://api.github.com/repos/pydata/xarray/issues/1342 MDEyOklzc3VlQ29tbWVudDI5MDYwMDE3NA== pwolfram 4295853 2017-03-31T02:40:02Z 2017-03-31T02:40:08Z CONTRIBUTOR

@shoyer the changes have been made and this should be ready to merge now.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Ensures drop=True case works with empty mask 218291642
290571024 https://github.com/pydata/xarray/pull/1336#issuecomment-290571024 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI5MDU3MTAyNA== pwolfram 4295853 2017-03-30T23:12:15Z 2017-03-30T23:12:15Z CONTRIBUTOR

This also fixes the issue noted in https://github.com/pydata/xarray/pull/1038 where flaky tests cause Travis CI failures.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
290570908 https://github.com/pydata/xarray/pull/1336#issuecomment-290570908 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI5MDU3MDkwOA== pwolfram 4295853 2017-03-30T23:11:29Z 2017-03-30T23:11:29Z CONTRIBUTOR

@shoyer, as we discussed, here is a more robust version of the tests, as needed for 0.9.2.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
290566932 https://github.com/pydata/xarray/pull/1342#issuecomment-290566932 https://api.github.com/repos/pydata/xarray/issues/1342 MDEyOklzc3VlQ29tbWVudDI5MDU2NjkzMg== pwolfram 4295853 2017-03-30T22:48:33Z 2017-03-30T22:48:33Z CONTRIBUTOR

@shoyer, note that we probably need this fix to be included in the 0.9.2 release.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Ensures drop=True case works with empty mask 218291642
290507367 https://github.com/pydata/xarray/issues/1341#issuecomment-290507367 https://api.github.com/repos/pydata/xarray/issues/1341 MDEyOklzc3VlQ29tbWVudDI5MDUwNzM2Nw== pwolfram 4295853 2017-03-30T18:46:05Z 2017-03-30T18:46:05Z CONTRIBUTOR

Thanks @shoyer, the test stub is at https://github.com/pydata/xarray/pull/1342 and hopefully CI will give some more useful information.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  where(..., drop=True) failure for empty mask on python 2.7 218277814
290492243 https://github.com/pydata/xarray/issues/1341#issuecomment-290492243 https://api.github.com/repos/pydata/xarray/issues/1341 MDEyOklzc3VlQ29tbWVudDI5MDQ5MjI0Mw== pwolfram 4295853 2017-03-30T17:59:17Z 2017-03-30T17:59:17Z CONTRIBUTOR

@shoyer, is there any easy way to initialize

```python
<xarray.DataArray (nCells: 0, nVertLevels: 10)>
array([], shape=(0, 10), dtype=float64)
Dimensions without coordinates: nCells, nVertLevels
```

so that I can write a clean test for this error?
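For what it's worth, one straightforward way to construct such an empty array directly (a sketch; the dimension names are taken from the repr above):

```python
import numpy as np
import xarray as xr

# Build a DataArray with a zero-length nCells dimension directly,
# without round-tripping through where(..., drop=True).
empty = xr.DataArray(
    np.empty((0, 10), dtype="float64"),
    dims=["nCells", "nVertLevels"],
)
print(empty.shape)  # (0, 10)
```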

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  where(..., drop=True) failure for empty mask on python 2.7 218277814
290491964 https://github.com/pydata/xarray/issues/1341#issuecomment-290491964 https://api.github.com/repos/pydata/xarray/issues/1341 MDEyOklzc3VlQ29tbWVudDI5MDQ5MTk2NA== pwolfram 4295853 2017-03-30T17:58:31Z 2017-03-30T17:58:31Z CONTRIBUTOR

The script (e.g., on 3.5) should return something like

```python
In [1]: import xarray as xr

In [2]: import numpy as np

In [3]: da = xr.DataArray(np.random.rand(100,10), dims=['nCells','nVertLevels'])

In [4]: mask = xr.DataArray(np.zeros((100,), dtype='bool'), dims='nCells')

In [5]: da.where(mask, drop=True)

Out[6]:
<xarray.DataArray (nCells: 0, nVertLevels: 10)>
array([], shape=(0, 10), dtype=float64)
Dimensions without coordinates: nCells, nVertLevels
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  where(..., drop=True) failure for empty mask on python 2.7 218277814
290237717 https://github.com/pydata/xarray/issues/1338#issuecomment-290237717 https://api.github.com/repos/pydata/xarray/issues/1338 MDEyOklzc3VlQ29tbWVudDI5MDIzNzcxNw== pwolfram 4295853 2017-03-29T21:52:16Z 2017-03-29T21:52:16Z CONTRIBUTOR

Thanks @shoyer. This is what I was thinking, but I just wanted to double-check. Feel free to close the issue if you would like; I'll move this to the dask issue tracker.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Chunking and dask memory errors 218013400
290193129 https://github.com/pydata/xarray/pull/1336#issuecomment-290193129 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI5MDE5MzEyOQ== pwolfram 4295853 2017-03-29T19:05:04Z 2017-03-29T19:05:04Z CONTRIBUTOR

@shoyer, is there anything else you would like done on this before merging?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
289880785 https://github.com/pydata/xarray/pull/1336#issuecomment-289880785 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI4OTg4MDc4NQ== pwolfram 4295853 2017-03-28T19:37:12Z 2017-03-28T19:37:12Z CONTRIBUTOR

@shoyer, all tests pass and I've spot-checked that this works as expected, e.g.,

```python
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_autoclose_netcdf4 PASSED

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_open_large_num_files_netcdf4 SKIPPED

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_autoclose_scipy PASSED

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_open_large_num_files_scipy SKIPPED

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_autoclose_pynio PASSED

xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio SKIPPED
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
289874334 https://github.com/pydata/xarray/pull/1336#issuecomment-289874334 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI4OTg3NDMzNA== pwolfram 4295853 2017-03-28T19:12:21Z 2017-03-28T19:12:30Z CONTRIBUTOR

@shoyer et al, please feel free to mark additional tests as slow in this PR. I only marked the ones from #1198 that were too slow for this round.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
289873207 https://github.com/pydata/xarray/pull/1038#issuecomment-289873207 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTg3MzIwNw== pwolfram 4295853 2017-03-28T19:07:50Z 2017-03-28T19:07:50Z CONTRIBUTOR

See #1336 for a fix that disables these tests that have been acting up because of resource issues.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289872994 https://github.com/pydata/xarray/pull/1336#issuecomment-289872994 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI4OTg3Mjk5NA== pwolfram 4295853 2017-03-28T19:07:01Z 2017-03-28T19:07:01Z CONTRIBUTOR

cc @MaximilianR

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
289872510 https://github.com/pydata/xarray/pull/1336#issuecomment-289872510 https://api.github.com/repos/pydata/xarray/issues/1336 MDEyOklzc3VlQ29tbWVudDI4OTg3MjUxMA== pwolfram 4295853 2017-03-28T19:05:14Z 2017-03-28T19:05:14Z CONTRIBUTOR

@shoyer, this PR should also fix the issue with Travis CI failing due to tests opening too many files. Note that when a user runs py.test locally, the slow tests will run; they can be skipped by running py.test --skip-slow.
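The marker-plus-flag combination described here follows the standard pytest pattern; a hedged sketch of a hypothetical conftest.py (the actual xarray test configuration may differ in its details):

```python
# conftest.py (sketch)
import pytest

def pytest_addoption(parser):
    # Register the --skip-slow command-line flag.
    parser.addoption("--skip-slow", action="store_true", default=False,
                     help="skip tests marked as slow")

def pytest_runtest_setup(item):
    # Skip any test carrying the 'slow' marker when --skip-slow is given.
    if "slow" in item.keywords and item.config.getoption("--skip-slow"):
        pytest.skip("skipping slow test")
```

A test is then opted in with `@pytest.mark.slow` and runs by default, exactly as described above.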

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Marks slow, flaky, and failing tests 217660739
289840161 https://github.com/pydata/xarray/issues/1332#issuecomment-289840161 https://api.github.com/repos/pydata/xarray/issues/1332 MDEyOklzc3VlQ29tbWVudDI4OTg0MDE2MQ== pwolfram 4295853 2017-03-28T17:14:29Z 2017-03-28T17:14:29Z CONTRIBUTOR

@rabernat, do you think the proposed keyword additions should be included in xarray or not? I personally would like to see them in xarray, but I don't know if it is just me. If you think they should be in xarray, are you ok with the API above?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Shape preserving `diff` via new keywords 217385961
289833779 https://github.com/pydata/xarray/issues/1332#issuecomment-289833779 https://api.github.com/repos/pydata/xarray/issues/1332 MDEyOklzc3VlQ29tbWVudDI4OTgzMzc3OQ== pwolfram 4295853 2017-03-28T16:52:25Z 2017-03-28T16:52:25Z CONTRIBUTOR

@shoyer, I'm not sure we want to wrap np.gradient. It seems like other approaches, such as @rabernat's xgcm, would be more appropriate as a superset of xarray.

Fundamentally, I want something like an inverse of cumsum, and the proposed change could be used in that context. It is just super inconvenient to do array resizing after diffing a time vector to get timesteps, but maybe this use case is too niche to be useful for the community.
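As an aside, the shape-preserving behavior wanted here can be approximated with NumPy alone, assuming NumPy ≥ 1.16 where np.diff gained the prepend/append keywords (this is plain NumPy, not the xarray keyword proposed in the issue):

```python
import numpy as np

t = np.array([0.0, 1.0, 3.0, 6.0])

# Prepending the first element keeps the output the same length as the
# input, making diff an exact inverse of cumsum under this convention.
dt = np.diff(t, prepend=t[0])

print(dt)             # [0. 1. 2. 3.]
print(np.cumsum(dt))  # [0. 1. 3. 6.]  -- recovers t
```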

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Shape preserving `diff` via new keywords 217385961
289832319 https://github.com/pydata/xarray/issues/1335#issuecomment-289832319 https://api.github.com/repos/pydata/xarray/issues/1335 MDEyOklzc3VlQ29tbWVudDI4OTgzMjMxOQ== pwolfram 4295853 2017-03-28T16:47:36Z 2017-03-28T16:47:36Z CONTRIBUTOR

Thanks @fmaussion, it just seems strange that we return an error when the data is one-dimensional. I would agree that we probably want an error for dimensionality greater than one. I think the thing to change here is to make da.cumsum() work for a one-dimensional da, but it may not be worth the effort relative to other issues.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  `cumsum` providing correct behavior for non-coordinate DataArrays? 217584777
289116553 https://github.com/pydata/xarray/pull/1038#issuecomment-289116553 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTExNjU1Mw== pwolfram 4295853 2017-03-24T19:04:42Z 2017-03-24T19:04:42Z CONTRIBUTOR

Crash in the same place... but when I restarted it via a force push earlier it passed, which would imply we are running out of resources on Travis.

Maybe the thing to do is just to reset the open-file limit as @rabernat suggested; that way it provides a factor of safety on Travis.

Thoughts on this idea @shoyer and @fmaussion?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289114015 https://github.com/pydata/xarray/pull/1038#issuecomment-289114015 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTExNDAxNQ== pwolfram 4295853 2017-03-24T18:54:45Z 2017-03-24T18:54:45Z CONTRIBUTOR

Is it possible that the test fails if more than one is run simultaneously on the same node? Could you restart the other tests to verify (restarting at the same time if possible)?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289113304 https://github.com/pydata/xarray/pull/1038#issuecomment-289113304 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTExMzMwNA== pwolfram 4295853 2017-03-24T18:52:07Z 2017-03-24T18:52:07Z CONTRIBUTOR

Still passing locally...

```
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_autoclose_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_1_open_large_num_files_netcdf4 PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_autoclose_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_2_open_large_num_files_scipy PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_autoclose_pynio PASSED
xarray/tests/test_backends.py::OpenMFDatasetManyFilesTest::test_3_open_large_num_files_pynio PASSED
```

The test passes even if I run it multiple times.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289111963 https://github.com/pydata/xarray/pull/1038#issuecomment-289111963 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTExMTk2Mw== pwolfram 4295853 2017-03-24T18:46:49Z 2017-03-24T18:47:26Z CONTRIBUTOR

I'm continuing to take a look-- my tests were not 100% set up locally on this branch and I'll see if I can reproduce the sporadic error on macOS.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289109435 https://github.com/pydata/xarray/pull/1038#issuecomment-289109435 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTEwOTQzNQ== pwolfram 4295853 2017-03-24T18:36:35Z 2017-03-24T18:36:35Z CONTRIBUTOR

@shoyer, should I do a quick "hot fix" and then try to sort out the problem?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289108876 https://github.com/pydata/xarray/pull/1038#issuecomment-289108876 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTEwODg3Ng== pwolfram 4295853 2017-03-24T18:34:13Z 2017-03-24T18:34:13Z CONTRIBUTOR

It happened here too... I just tried it out on my local machine via conda env create -f ci/requirements-py27-cdat+pynio.yml and wasn't able to get an error... are any of the crashes better than a "seg fault"?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289080885 https://github.com/pydata/xarray/pull/1038#issuecomment-289080885 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4OTA4MDg4NQ== pwolfram 4295853 2017-03-24T16:58:43Z 2017-03-24T16:58:43Z CONTRIBUTOR

@shoyer, added a test as requested.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 intialization retained 181033674
289054173 https://github.com/pydata/xarray/issues/1167#issuecomment-289054173 https://api.github.com/repos/pydata/xarray/issues/1167 MDEyOklzc3VlQ29tbWVudDI4OTA1NDE3Mw== pwolfram 4295853 2017-03-24T15:24:31Z 2017-03-24T15:24:31Z CONTRIBUTOR

I think we can close this because 0.9.0 has been released.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Things to complete before releasing xarray v0.9.0 195971828
288867744 https://github.com/pydata/xarray/issues/463#issuecomment-288867744 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI4ODg2Nzc0NA== pwolfram 4295853 2017-03-23T21:36:07Z 2017-03-23T21:36:07Z CONTRIBUTOR

@ajoros should correct me if I'm wrong but it sounds like everything is working for his use case.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498
288832707 https://github.com/pydata/xarray/issues/463#issuecomment-288832707 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI4ODgzMjcwNw== pwolfram 4295853 2017-03-23T19:21:57Z 2017-03-23T19:21:57Z CONTRIBUTOR

@ajoros, #1198 was just merged so the bleeding-edge version of xarray is the one to try!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498
288832565 https://github.com/pydata/xarray/pull/1198#issuecomment-288832565 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODgzMjU2NQ== pwolfram 4295853 2017-03-23T19:21:25Z 2017-03-23T19:21:25Z CONTRIBUTOR

Thanks a bunch @shoyer!

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288830741 https://github.com/pydata/xarray/issues/463#issuecomment-288830741 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI4ODgzMDc0MQ== pwolfram 4295853 2017-03-23T19:14:23Z 2017-03-23T19:14:23Z CONTRIBUTOR

@ajoros, can you try something like pip -v install --force git+ssh://git@github.com/pwolfram/xarray@fix_too_many_open_files to see if #1198 fixes your problem with your dataset, noting that you need open_mfdataset(..., autoclose=True)?

@shoyer should correct me if I'm wrong but we are almost ready to merge the code in this PR and this would be a great "in the field" check if you could try it out soon.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498
288828445 https://github.com/pydata/xarray/issues/1319#issuecomment-288828445 https://api.github.com/repos/pydata/xarray/issues/1319 MDEyOklzc3VlQ29tbWVudDI4ODgyODQ0NQ== pwolfram 4295853 2017-03-23T19:06:06Z 2017-03-23T19:06:06Z CONTRIBUTOR

As long as we can explicitly obtain the attrs data if necessary, truncating the data for the repr makes sense. Note that appending a set of characters like ... to the end would be useful to indicate that the string continues but is only partially displayed.
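A minimal sketch of the idea in plain Python (the helper name truncate_attr and the 80-character cutoff are illustrative, not xarray's actual repr code):

```python
def truncate_attr(value, maxlen=80):
    """Shorten a long attribute value for display, marking the cut with '...'."""
    text = str(value)
    if len(text) <= maxlen:
        return text
    return text[:maxlen] + "..."

short = truncate_attr("history: created by model v1")  # unchanged, fits within limit
long_attr = truncate_attr("x" * 200)                   # 80 chars of data plus the "..." marker
```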

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Truncate long lines in repr of Dataset.attrs 216329175
288796995 https://github.com/pydata/xarray/pull/1198#issuecomment-288796995 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODc5Njk5NQ== pwolfram 4295853 2017-03-23T17:26:23Z 2017-03-23T17:26:23Z CONTRIBUTOR

@shoyer, all tests (including coveralls) passed. Please let me know if you have additional concerns; if we could merge fairly soon, e.g., because of https://github.com/MPAS-Dev/MPAS-Analysis/issues/151, I would really appreciate it.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288795432 https://github.com/pydata/xarray/pull/1198#issuecomment-288795432 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODc5NTQzMg== pwolfram 4295853 2017-03-23T17:22:11Z 2017-03-23T17:22:11Z CONTRIBUTOR

@shoyer, this is ready for the final review now. Coveralls appears to have hung but other tests pass.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288791152 https://github.com/pydata/xarray/pull/1198#issuecomment-288791152 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODc5MTE1Mg== pwolfram 4295853 2017-03-23T17:08:41Z 2017-03-23T17:08:41Z CONTRIBUTOR

@shoyer, I had a minor bug that is now removed. The last caveat is no longer applicable:

The scipy backend can handle objects like BytesIO that really aren't file handles and there doesn't appear to be a clean way to close these types of objects. So, at present I'm explicitly setting _autoclose=False if they are encountered in the datastore. If this needs to be changed, particularly since it doesn't affect existing behavior, I'd prefer this be resolved in a separate issue / PR if possible.

I'll let you know when tests pass and this is ready for your final review.
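The caveat can be demonstrated with the stdlib alone: unlike a file on disk, a closed BytesIO cannot be reopened later because its buffer is discarded on close, which is why forcing _autoclose=False for such objects is the safe choice.

```python
import io

buf = io.BytesIO(b"fake netCDF bytes")  # in-memory stand-in for a file handle
data_before = buf.getvalue()            # contents are accessible while open
buf.close()                             # frees the buffer permanently
try:
    buf.getvalue()
    reopenable = True
except ValueError:                      # "I/O operation on closed file"
    reopenable = False                  # and no filename exists to reopen from
```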

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288782907 https://github.com/pydata/xarray/pull/1198#issuecomment-288782907 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODc4MjkwNw== pwolfram 4295853 2017-03-23T16:44:44Z 2017-03-23T16:44:44Z CONTRIBUTOR

@shoyer, that subclass-based approach you outlined worked (fixture parameters really don't work with classes as far as I could tell). We now have more comprehensive, named testing. Note, there was one minor point requiring more explicit specification that arose from the more rigorous testing:

The scipy backend can handle objects like BytesIO that really aren't file handles and there doesn't appear to be a clean way to close these types of objects. So, at present I'm explicitly setting _autoclose=False if they are encountered in the datastore. If this needs to be changed, particularly since it doesn't affect existing behavior, I'd prefer this be resolved in a separate issue / PR if possible.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288559231 https://github.com/pydata/xarray/pull/1198#issuecomment-288559231 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODU1OTIzMQ== pwolfram 4295853 2017-03-22T22:26:40Z 2017-03-22T22:26:40Z CONTRIBUTOR

@shoyer, if we generally cover test_backends for autoclose=True, then we should get the pickle testing for free:

xarray/tests/test_backends.py:181: def test_pickle(self):
xarray/tests/test_backends.py:191: def test_pickle_dataarray(self):
xarray/tests/test_backends.py:792: def test_bytesio_pickle(self):

or was there some other test that is needed?
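For context on why pickling needs coverage at all: an open OS-level file handle cannot be serialized, so a datastore has to be able to close and later reopen its file for a Dataset to survive a pickle round trip. A stdlib-only illustration:

```python
import os
import pickle

fh = open(os.devnull)          # any open file handle
try:
    pickle.dumps(fh)
    handle_picklable = True
except TypeError:              # open handles cannot cross process boundaries
    handle_picklable = False
fh.close()

# plain data, by contrast, round-trips fine
restored = pickle.loads(pickle.dumps({"foo": [1, 2, 3]}))
```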

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288433091 https://github.com/pydata/xarray/pull/1038#issuecomment-288433091 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4ODQzMzA5MQ== pwolfram 4295853 2017-03-22T15:21:07Z 2017-03-22T15:21:07Z CONTRIBUTOR

Provided checks pass this should be ready to merge @fmaussion unless @shoyer has any additional recommended changes.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 initialization retained 181033674
288432628 https://github.com/pydata/xarray/pull/1038#issuecomment-288432628 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4ODQzMjYyOA== pwolfram 4295853 2017-03-22T15:19:45Z 2017-03-22T15:19:45Z CONTRIBUTOR

Note, I would say that open_mfdataset is no longer experimental because of its widespread use.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 initialization retained 181033674
288427192 https://github.com/pydata/xarray/issues/1013#issuecomment-288427192 https://api.github.com/repos/pydata/xarray/issues/1013 MDEyOklzc3VlQ29tbWVudDI4ODQyNzE5Mg== pwolfram 4295853 2017-03-22T15:03:07Z 2017-03-22T15:03:07Z CONTRIBUTOR

Note, this issue should be resolvable via #924.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Groupby exclude dimension 178200674
288423681 https://github.com/pydata/xarray/pull/1038#issuecomment-288423681 https://api.github.com/repos/pydata/xarray/issues/1038 MDEyOklzc3VlQ29tbWVudDI4ODQyMzY4MQ== pwolfram 4295853 2017-03-22T14:52:34Z 2017-03-22T14:52:34Z CONTRIBUTOR

@fmaussion and @shoyer, I'd like to close this PR out if possible. I'm not 100% sure this PR is worthwhile to complete in a general fashion because of the ambiguity in how to best handle this issue. My current take on this would be to go with whatever is simplest / cleanest, at least in the short term, which is @fmaussion's suggestion above. Does this work for you both?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Attributes from netCDF4 initialization retained 181033674
288414991 https://github.com/pydata/xarray/issues/463#issuecomment-288414991 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI4ODQxNDk5MQ== pwolfram 4295853 2017-03-22T14:25:37Z 2017-03-22T14:25:37Z CONTRIBUTOR

We are very close on #1198 and will be merging soon. This would be a great time for everyone to ensure that #1198 resolves this issue before we merge.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498
288414396 https://github.com/pydata/xarray/issues/798#issuecomment-288414396 https://api.github.com/repos/pydata/xarray/issues/798 MDEyOklzc3VlQ29tbWVudDI4ODQxNDM5Ng== pwolfram 4295853 2017-03-22T14:23:45Z 2017-03-22T14:23:45Z CONTRIBUTOR

@mrocklin and @shoyer, we now have dask.distributed and xarray support. Should this issue be closed?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Integration with dask/distributed (xarray backend design) 142498006
288141361 https://github.com/pydata/xarray/pull/1198#issuecomment-288141361 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODE0MTM2MQ== pwolfram 4295853 2017-03-21T16:45:57Z 2017-03-21T16:45:57Z CONTRIBUTOR

Thanks @shoyer. Ok, so we pass checks following merge of #1311. Is there anything else that we need to do on this PR prior to merging?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288127346 https://github.com/pydata/xarray/pull/1311#issuecomment-288127346 https://api.github.com/repos/pydata/xarray/issues/1311 MDEyOklzc3VlQ29tbWVudDI4ODEyNzM0Ng== pwolfram 4295853 2017-03-21T16:03:04Z 2017-03-21T16:03:04Z CONTRIBUTOR

Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  More explicit check for dtype roundtripping in backends 214761059
288103835 https://github.com/pydata/xarray/issues/1309#issuecomment-288103835 https://api.github.com/repos/pydata/xarray/issues/1309 MDEyOklzc3VlQ29tbWVudDI4ODEwMzgzNQ== pwolfram 4295853 2017-03-21T14:53:38Z 2017-03-21T14:53:38Z CONTRIBUTOR

@shoyer, this seems pretty straightforward from http://doc.pytest.org/en/latest/example/simple.html#control-skipping-of-tests-according-to-command-line-option -- all we need is to define the @slow decorator and mark the slow tests, correct? This seems like a quick PR to write up and fulfill following the merge of #1198, unless I'm missing something.
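A sketch of that pattern from the linked pytest docs (the option name --run-slow is illustrative; this is pytest configuration, not code from this PR):

```python
# conftest.py
import pytest

def pytest_addoption(parser):
    parser.addoption("--run-slow", action="store_true", default=False,
                     help="also run tests marked as slow")

def pytest_collection_modifyitems(config, items):
    if config.getoption("--run-slow"):
        return  # --run-slow given: run everything
    skip_slow = pytest.mark.skip(reason="need --run-slow option to run")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_slow)
```

Individual tests are then tagged with @pytest.mark.slow and are skipped unless --run-slow is passed on the command line.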

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Mark slow tests and don't run them by default 214201008
288102478 https://github.com/pydata/xarray/pull/1311#issuecomment-288102478 https://api.github.com/repos/pydata/xarray/issues/1311 MDEyOklzc3VlQ29tbWVudDI4ODEwMjQ3OA== pwolfram 4295853 2017-03-21T14:49:41Z 2017-03-21T14:49:41Z CONTRIBUTOR

@shoyer, should this PR be merged before #1198, just to make sure that there aren't any "gotchas" I've missed in #1198?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  More explicit check for dtype roundtripping in backends 214761059
288102017 https://github.com/pydata/xarray/pull/1198#issuecomment-288102017 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODEwMjAxNw== pwolfram 4295853 2017-03-21T14:48:21Z 2017-03-21T14:48:21Z CONTRIBUTOR

@shoyer, we can roll back the squash if you want, because for whatever reason my interactive rebase also removed your name from being tagged on the commit, which is not what I expected. This is obviously suboptimal and the choice of having two commits versus one is up to you. I think the commit prior to the squash was 9c274eb.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
288101049 https://github.com/pydata/xarray/pull/1198#issuecomment-288101049 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4ODEwMTA0OQ== pwolfram 4295853 2017-03-21T14:45:28Z 2017-03-21T14:45:28Z CONTRIBUTOR

The tests passed so I will squash the commits...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287165679 https://github.com/pydata/xarray/pull/1198#issuecomment-287165679 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzE2NTY3OQ== pwolfram 4295853 2017-03-16T19:27:22Z 2017-03-16T19:27:22Z CONTRIBUTOR

@shoyer, all tests are passing now following your recommended edits.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287163231 https://github.com/pydata/xarray/pull/1198#issuecomment-287163231 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzE2MzIzMQ== pwolfram 4295853 2017-03-16T19:17:44Z 2017-03-16T19:17:44Z CONTRIBUTOR

@shoyer- can you please have another look following edits as you requested?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287142757 https://github.com/pydata/xarray/pull/1198#issuecomment-287142757 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzE0Mjc1Nw== pwolfram 4295853 2017-03-16T18:03:32Z 2017-03-16T18:03:32Z CONTRIBUTOR

Note: tests pass before submitting changes.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287121290 https://github.com/pydata/xarray/pull/1198#issuecomment-287121290 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzEyMTI5MA== pwolfram 4295853 2017-03-16T16:54:19Z 2017-03-16T16:54:19Z CONTRIBUTOR

They did- thank you! I'll take a quick look at your comments now so we can hopefully get this merged soon.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287112491 https://github.com/pydata/xarray/pull/1198#issuecomment-287112491 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzExMjQ5MQ== pwolfram 4295853 2017-03-16T16:26:09Z 2017-03-16T16:26:09Z CONTRIBUTOR

@shoyer, do you mind restarting the Travis-CI?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287100278 https://github.com/pydata/xarray/pull/1198#issuecomment-287100278 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzEwMDI3OA== pwolfram 4295853 2017-03-16T15:47:53Z 2017-03-16T15:47:53Z CONTRIBUTOR

Awesome, thanks @vnoel for testing this for us all on your dataset!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287090864 https://github.com/pydata/xarray/pull/1198#issuecomment-287090864 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzA5MDg2NA== pwolfram 4295853 2017-03-16T15:19:37Z 2017-03-16T15:19:37Z CONTRIBUTOR

Is something wrong with travis-ci right now? It doesn't look like it is running the tests...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287061697 https://github.com/pydata/xarray/pull/1198#issuecomment-287061697 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzA2MTY5Nw== pwolfram 4295853 2017-03-16T13:48:09Z 2017-03-16T13:48:09Z CONTRIBUTOR

Note, the force push was just to get travis-ci to "reboot". There was no real change to the code.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286637311 https://github.com/pydata/xarray/pull/1198#issuecomment-286637311 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjYzNzMxMQ== pwolfram 4295853 2017-03-15T04:22:28Z 2017-03-16T13:47:46Z CONTRIBUTOR

Note, if we include 482ef54 for resource-limited testing we will need to do something special for Windows on appveyor. We could skip this in the short term to get this merged and deal with these types of issues in #1309, which is probably what makes the most sense to help keep the scope of this PR limited.
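For reference, a sketch of what resource-limited testing looks like with the stdlib resource module (POSIX-only, hence the Windows/appveyor problem; the limit of 64 descriptors is an arbitrary choice here):

```python
import resource

def lower_fd_limit(n):
    """Lower the soft limit on open file descriptors; return the old limits."""
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    target = n if hard == resource.RLIM_INFINITY else min(n, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
    return soft, hard

old_soft, old_hard = lower_fd_limit(64)
# ... exercise open_mfdataset(..., autoclose=True) on many files here ...
resource.setrlimit(resource.RLIMIT_NOFILE, (old_soft, old_hard))  # restore
```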

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
287061175 https://github.com/pydata/xarray/pull/1198#issuecomment-287061175 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NzA2MTE3NQ== pwolfram 4295853 2017-03-16T13:46:18Z 2017-03-16T13:46:18Z CONTRIBUTOR

@vnoel, did you use autoclose=True? Can you please share the code snippet you used to instantiate this?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286831879 https://github.com/pydata/xarray/pull/1198#issuecomment-286831879 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjgzMTg3OQ== pwolfram 4295853 2017-03-15T18:12:28Z 2017-03-15T18:12:28Z CONTRIBUTOR

@rabernat, I suspect you have some datasets that could be used to stress test this PR too... thanks for the help and advice yesterday.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286831672 https://github.com/pydata/xarray/pull/1198#issuecomment-286831672 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjgzMTY3Mg== pwolfram 4295853 2017-03-15T18:11:53Z 2017-03-15T18:11:53Z CONTRIBUTOR

@PeterDSteinberg and @vnoel, do you mind stress-testing this PR to make sure all works as expected? You can install easily via pip with pip -v install git+ssh://git@github.com/pwolfram/xarray@fix_too_many_open_files

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286824521 https://github.com/pydata/xarray/pull/1198#issuecomment-286824521 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjgyNDUyMQ== pwolfram 4295853 2017-03-15T17:48:42Z 2017-03-15T17:48:42Z CONTRIBUTOR

@shoyer, I've collapsed all the commits to a single commit. At this point I would say this meets the scope of this PR and it should be ready to merge once CI finishes its check and it passes. Please feel free to take another pass on the code for changes that you would like to see made.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286636632 https://github.com/pydata/xarray/pull/1198#issuecomment-286636632 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjYzNjYzMg== pwolfram 4295853 2017-03-15T04:17:01Z 2017-03-15T04:17:01Z CONTRIBUTOR

I obviously need to squash prior to the merge, however...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286636605 https://github.com/pydata/xarray/pull/1198#issuecomment-286636605 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4NjYzNjYwNQ== pwolfram 4295853 2017-03-15T04:16:46Z 2017-03-15T04:16:46Z CONTRIBUTOR

@shoyer, this should essentially be ready to go now with the exception that I didn't come to a clean resolution on testing the too many open files issue via resource. Thoughts on this issue?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
286517016 https://github.com/pydata/xarray/pull/924#issuecomment-286517016 https://api.github.com/repos/pydata/xarray/issues/924 MDEyOklzc3VlQ29tbWVudDI4NjUxNzAxNg== pwolfram 4295853 2017-03-14T18:30:00Z 2017-03-14T18:30:00Z CONTRIBUTOR

@RafalSkolasinski and @shoyer, can I please get an update on this PR? This is something we need sometime soon too (cc @milenaveneziani).

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  WIP: progress toward making groupby work with multiple arguments 168272291
280654955 https://github.com/pydata/xarray/pull/1198#issuecomment-280654955 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI4MDY1NDk1NQ== pwolfram 4295853 2017-02-17T13:53:09Z 2017-02-17T13:53:09Z CONTRIBUTOR

@shoyer, I'd like to see this code integrated if possible. Should we just disable autoclose for h5netcdf and merge?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
278987344 https://github.com/pydata/xarray/issues/1257#issuecomment-278987344 https://api.github.com/repos/pydata/xarray/issues/1257 MDEyOklzc3VlQ29tbWVudDI3ODk4NzM0NA== pwolfram 4295853 2017-02-10T16:15:22Z 2017-02-10T16:15:22Z CONTRIBUTOR

We would also benefit from this specifically for #1198 :+1:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  PERF: Add benchmarking? 206632333
278704180 https://github.com/pydata/xarray/pull/1198#issuecomment-278704180 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3ODcwNDE4MA== pwolfram 4295853 2017-02-09T17:00:50Z 2017-02-09T17:00:50Z CONTRIBUTOR

@shoyer, I'll leave the ball in your court for now but am happy to take another look before we go the disabling autoclose route for h5netcdf. Please let me know when/if you'd like me to take another look.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
278691935 https://github.com/pydata/xarray/pull/1198#issuecomment-278691935 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3ODY5MTkzNQ== pwolfram 4295853 2017-02-09T16:21:38Z 2017-02-09T16:21:38Z CONTRIBUTOR

@shoyer, it is quite possible the error is on my end because the error message implies that modification of the data prior to closing it may be the issue. However, I'm not getting that type of an error following a similar philosophy for the other backends, which is confusing.

Do you have advice on hunting this bug further? You obviously have more experience with h5py and h5netcdf than I do.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
277498382 https://github.com/pydata/xarray/pull/1198#issuecomment-277498382 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3NzQ5ODM4Mg== pwolfram 4295853 2017-02-05T05:20:22Z 2017-02-05T05:20:22Z CONTRIBUTOR

@shoyer, the pushed code represents my progress. The initial PR had a bug-- essentially a calculation couldn't be performed following the load. This fixes that bug and provides a test to ensure that this doesn't happen. However, I'm having trouble with h5netcdf, which I'm not very familiar with compared to netcdf. This represents my current progress; I just need some more time (or even inspiration from you) to sort out this last key issue...

I'm getting the following error:

```
================================================ FAILURES ================================================
_____________ OpenMFDatasetTest.test_4_open_large_num_files_h5netcdf _____________

self = <xarray.tests.test_backends.OpenMFDatasetTest testMethod=test_4_open_large_num_files_h5netcdf>

    @requires_dask
    @requires_h5netcdf
    def test_4_open_large_num_files_h5netcdf(self):
        self.validate_open_mfdataset_large_num_files(engine=['h5netcdf'])

xarray/tests/test_backends.py:1040:

xarray/tests/test_backends.py:1018: in validate_open_mfdataset_large_num_files
    self.assertClose(ds.foo.sum().values, np.sum(randdata))
xarray/core/dataarray.py:400: in values
    return self.variable.values
xarray/core/variable.py:306: in values
    return _as_array_or_item(self._data)
xarray/core/variable.py:182: in _as_array_or_item
    data = np.asarray(data)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/numpy/core/numeric.py:482: in asarray
    return array(a, dtype, copy=False, order=order)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/array/core.py:1025: in __array__
    x = self.compute()
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/base.py:79: in compute
    return compute(self, **kwargs)[0]
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/base.py:179: in compute
    results = get(dsk, keys, **kwargs)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:537: in get_sync
    raise_on_exception=True, **kwargs)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:500: in get_async
    fire_task()
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:476: in fire_task
    callback=queue.put)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:525: in apply_sync
    res = func(*args, **kwds)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:268: in execute_task
    result = _execute_task(task, data)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:248: in _execute_task
    args2 = [_execute_task(a, cache) for a in args]
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:248: in <listcomp>
    args2 = [_execute_task(a, cache) for a in args]
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:245: in _execute_task
    return [_execute_task(a, cache) for a in arg]
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:245: in <listcomp>
    return [_execute_task(a, cache) for a in arg]
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:249: in _execute_task
    return func(*args2)
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/array/core.py:52: in getarray
    c = a[b]
xarray/core/indexing.py:401: in __getitem__
    return type(self)(self.array[key])
xarray/core/indexing.py:376: in __getitem__
    return type(self)(self.array, self._updated_key(key))
xarray/core/indexing.py:354: in _updated_key
    for size, k in zip(self.array.shape, self.key):
xarray/core/indexing.py:364: in shape
    for size, k in zip(self.array.shape, self.key):
xarray/core/utils.py:414: in shape
    return self.array.shape
xarray/backends/netCDF4_.py:37: in __getattr__
    return getattr(self.datastore.ds.variables[self.var], attr)
../../anaconda/envs/test_env_xarray35/lib/python3.5/contextlib.py:66: in __exit__
    next(self.gen)
xarray/backends/h5netcdf_.py:105: in ensure_open
    self.close()
xarray/backends/h5netcdf_.py:190: in close
    _close_ds(self.ds)
xarray/backends/h5netcdf_.py:70: in _close_ds
    find_root(ds).close()
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/h5netcdf/core.py:458: in close
    self._h5file.close()
../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/h5py/_hl/files.py:302: in close
    self.id.close()
h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2840)
    ???
h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2798)
    ???
h5py/h5f.pyx:282: in h5py.h5f.FileID.close (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/h5f.c:3905)
    ???
h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2840)
    ???
h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2798)
    ???
E   RuntimeError: dictionary changed size during iteration

h5py/_objects.pyx:119: RuntimeError
=============================== 1 failed, 1415 passed, 95 skipped in 116.54 seconds ===============================
Exception ignored in: <function WeakValueDictionary.__init__.<locals>.remove at 0x10f16e598>
Traceback (most recent call last):
  File "/Users/pwolfram/anaconda/envs/test_env_xarray35/lib/python3.5/weakref.py", line 117, in remove
TypeError: 'NoneType' object is not callable
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
277028448 https://github.com/pydata/xarray/pull/1198#issuecomment-277028448 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3NzAyODQ0OA== pwolfram 4295853 2017-02-02T17:42:41Z 2017-02-02T17:42:41Z CONTRIBUTOR

There are still a few more issues that need to be ironed out. I'll let you know when I've resolved them.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
276474582 https://github.com/pydata/xarray/pull/1198#issuecomment-276474582 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3NjQ3NDU4Mg== pwolfram 4295853 2017-01-31T19:59:30Z 2017-01-31T19:59:30Z CONTRIBUTOR

@shoyer and @PeterDSteinberg I've updated this PR to reflect requested changes.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
275254097 https://github.com/pydata/xarray/pull/1198#issuecomment-275254097 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3NTI1NDA5Nw== pwolfram 4295853 2017-01-25T22:32:17Z 2017-01-25T22:32:17Z CONTRIBUTOR

@PeterDSteinberg, did this PR fix the issue for you? I obviously need to update it, but I just wanted to confirm that the current branch resolves the too-many-open-files error. Also, do you have any sense of the performance impact of the changes I'm proposing?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
272323006 https://github.com/pydata/xarray/pull/1198#issuecomment-272323006 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3MjMyMzAwNg== pwolfram 4295853 2017-01-13T00:03:24Z 2017-01-13T00:03:24Z CONTRIBUTOR

Thanks @shoyer. This makes sense. I think the next round of edits should make sure the existing tests that use open_mfdataset exercise both autoclose options. That would future-proof us against accidentally breaking this new functionality and guard existing workflows against performance regressions.

Documentation is also obviously required.

FYI as a heads up, I probably won't be able to get to this until mid-week at the earliest, but it appears we are close to a viable solution.
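A sketch of what exercising both autoclose options in existing tests might look like (the names here are illustrative, not the actual xarray test suite; `fake_open_mfdataset` stands in for the real `xarray.open_mfdataset` on fixture files):

```python
# Hypothetical sketch of parametrizing a test over both autoclose options,
# as discussed above.
import pytest

def fake_open_mfdataset(paths, autoclose=False):
    # placeholder for the real open call; returns something checkable
    return {"paths": list(paths), "autoclose": autoclose}

@pytest.mark.parametrize("autoclose", [True, False])
def test_open_mfdataset_both_autoclose(autoclose):
    ds = fake_open_mfdataset(["a.nc", "b.nc"], autoclose=autoclose)
    assert ds["autoclose"] is autoclose
```

Parametrizing rather than duplicating tests keeps every existing open_mfdataset test covering both code paths with no extra maintenance.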

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
272075886 https://github.com/pydata/xarray/pull/1198#issuecomment-272075886 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3MjA3NTg4Ng== pwolfram 4295853 2017-01-12T04:54:27Z 2017-01-12T04:54:27Z CONTRIBUTOR

@shoyer, I just realized this might conflict with #1087. Do you foresee this causing problems, and in what order do you plan to merge this PR and #1087 (which obviously predates this one)? We are running into the snag from #463 in our analysis, and my personal preference would be to get some type of solution in place sooner rather than later. Thanks for considering this request.

Also, I'm not sure exactly the best way to test performance either. Could we potentially use something like the "toy" test cases for this purpose? Ideally we would have a test case with O(100) files to gain a clearer picture of the performance cost of this PR.

Please let me know what you want me to do with this PR-- should I clean it up in anticipation of a merge, or wait for now to see whether additional testing turns up other things that need to be fixed? Note that I now have full scipy, h5netcdf, and pynio implementations that can also be reviewed; they weren't available when you did your review yesterday.
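As a rough stand-in for such a benchmark, the per-read overhead of the open-read-close pattern can be estimated with the stdlib alone (a sketch only; the real measurement would use open_mfdataset on O(100) netCDF files):

```python
import os
import tempfile
import timeit

# Stdlib-only sketch of the overhead question above: compare re-opening a
# file on every read against reading from an already-open handle.
fd, path = tempfile.mkstemp()
os.write(fd, b"x" * 4096)
os.close(fd)

def read_with_reopen():
    with open(path, "rb") as f:
        f.read()

persistent = open(path, "rb")

def read_persistent():
    persistent.seek(0)
    persistent.read()

n = 2000
t_reopen = timeit.timeit(read_with_reopen, number=n)
t_persist = timeit.timeit(read_persistent, number=n)
print("reopen: %.4fs, persistent: %.4fs" % (t_reopen, t_persist))

persistent.close()
os.unlink(path)
```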

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
271961921 https://github.com/pydata/xarray/pull/1198#issuecomment-271961921 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3MTk2MTkyMQ== pwolfram 4295853 2017-01-11T19:01:42Z 2017-01-11T19:01:42Z CONTRIBUTOR

Thanks @shoyer. Does that mean that if the checks pass, the code is at least minimally correct in terms of not breaking previous design choices? E.g., does this imply that we are OK on this PR except for cleanup / implementation details?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
271957327 https://github.com/pydata/xarray/pull/1198#issuecomment-271957327 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3MTk1NzMyNw== pwolfram 4295853 2017-01-11T18:44:51Z 2017-01-11T18:44:51Z CONTRIBUTOR

@shoyer, all the checks "pass" but there are still errors in the "allowed" list. If you get a chance, could you please give me some perspective on whether these are errors on my end or not? I'm not exactly sure how to interpret them.

Once I know this code is correct, I plan to address the inline comments you graciously highlighted above. I think we are getting close, assuming there is enough testing to demonstrate we have actually fixed the too-many-open-files issue. Any additional ideas you have for tests would be really helpful too.
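One way a test could deterministically provoke the too-many-open-files condition is to lower the process's file-descriptor limit first; a Unix-only stdlib sketch (not the actual test in this PR):

```python
import resource
import tempfile

# Lower the soft file-descriptor limit, then open more files than allowed,
# provoking OSError (errno EMFILE, "Too many open files"). Unix-only.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

handles, caught = [], None
try:
    for _ in range(128):
        handles.append(tempfile.TemporaryFile())
except OSError as e:
    caught = e
finally:
    for h in handles:
        h.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))

print(type(caught).__name__)
```

Restoring the original limit in the finally block keeps the rest of the test session unaffected.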

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
271660072 https://github.com/pydata/xarray/pull/1198#issuecomment-271660072 https://api.github.com/repos/pydata/xarray/issues/1198 MDEyOklzc3VlQ29tbWVudDI3MTY2MDA3Mg== pwolfram 4295853 2017-01-10T18:41:36Z 2017-01-10T18:41:36Z CONTRIBUTOR

The intent of this PR is to address (or at least partially address) the following issues:

  • https://github.com/pydata/xarray/issues/463
  • https://github.com/pydata/xarray/issues/798
  • https://github.com/CCI-Tools/cate-core/issues/102
  • https://github.com/MPAS-Dev/MPAS-Analysis/issues/49
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Fixes OS error arising from too many files open 199900056
268359031 https://github.com/pydata/xarray/issues/1173#issuecomment-268359031 https://api.github.com/repos/pydata/xarray/issues/1173 MDEyOklzc3VlQ29tbWVudDI2ODM1OTAzMQ== pwolfram 4295853 2016-12-20T21:03:31Z 2016-12-20T21:03:31Z CONTRIBUTOR

@JoyMonteiro and @shoyer, as I've been thinking about this more, especially regarding #463, I was planning on building on opener from #1128 to essentially open, read, and then close a file each time a read (get) operation is needed on a netCDF file. My initial view was that output would fundamentally be serial, but as @JoyMonteiro points out, there may be a benefit to making provision for parallel output. However, we will probably run into the same netCDF limitation on the number of open files. Would we want similar functionality in opener for the set methods as well as the get methods? I'm not sure how something like sync would work in this context and suspect it could lead to problems. Presumably we would be required to write each dimension, attribute, variable, etc. at each call, each with its own associated open, write, and close. I obviously need to find the time to dig into this further...
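A minimal sketch of that per-access open/read/close pattern (the names `LazyFileArray` and `open_func` are illustrative; the real opener in #1128 wraps the netCDF backends):

```python
from contextlib import contextmanager

# Illustrative sketch of the per-access open/read/close idea discussed
# above: a file handle is held only for the duration of a single get.
class LazyFileArray:
    def __init__(self, path, open_func):
        self.path = path            # file to reopen on each access
        self.open_func = open_func  # e.g. netCDF4.Dataset in real use

    @contextmanager
    def _ensure_open(self):
        ds = self.open_func(self.path)
        try:
            yield ds
        finally:
            ds.close()  # never leave the handle open between accesses

    def get(self, key):
        with self._ensure_open() as ds:
            return ds[key]
```

Writes (set) would follow the same shape, but as noted above, syncing partial writes through repeated open/close cycles is where this gets hard.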

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Some queries 196541604
266036302 https://github.com/pydata/xarray/pull/1153#issuecomment-266036302 https://api.github.com/repos/pydata/xarray/issues/1153 MDEyOklzc3VlQ29tbWVudDI2NjAzNjMwMg== pwolfram 4295853 2016-12-09T15:09:52Z 2016-12-09T15:09:52Z CONTRIBUTOR

@shoyer, I think this is really great and appreciate you doing this awesome work. I obviously have a vested interest in the drop=True capability because we have found great use for it via where. My vote is go for it!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add drop=True argument to isel, sel and squeeze 193467825
263723460 https://github.com/pydata/xarray/issues/463#issuecomment-263723460 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI2MzcyMzQ2MA== pwolfram 4295853 2016-11-29T22:39:25Z 2016-11-29T23:30:59Z CONTRIBUTOR

I just realized I didn't say thank you to @shoyer et al for the advice and help. Please forgive my rudeness.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498
263721589 https://github.com/pydata/xarray/issues/463#issuecomment-263721589 https://api.github.com/repos/pydata/xarray/issues/463 MDEyOklzc3VlQ29tbWVudDI2MzcyMTU4OQ== pwolfram 4295853 2016-11-29T22:31:25Z 2016-11-29T22:31:25Z CONTRIBUTOR

@shoyer, if I understand correctly, the best approach as you see it is to build on opener via #1128, recognizing that this will essentially be "upgraded" sometime in the future, right?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset too many files 94328498


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
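For reference, the schema above can be exercised directly with the stdlib sqlite3 module; this sketch recreates the table in memory (foreign keys omitted) and runs the kind of query behind this page: rows where user = 4295853, sorted by updated_at descending.

```python
import sqlite3

# Recreate a simplified issue_comments table in an in-memory database
# and run this page's query: comments by one user, newest first.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY,
   [node_id] TEXT, [user] INTEGER, [created_at] TEXT, [updated_at] TEXT,
   [author_association] TEXT, [body] TEXT, [reactions] TEXT,
   [performed_via_github_app] TEXT, [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")
conn.executemany(
    "INSERT INTO issue_comments (id, user, updated_at) VALUES (?, ?, ?)",
    [(277028448, 4295853, "2017-02-02T17:42:41Z"),
     (276474582, 4295853, "2017-01-31T19:59:30Z"),
     (100, 99, "2017-01-01T00:00:00Z")],
)
rows = conn.execute(
    "SELECT id FROM issue_comments WHERE user = ? ORDER BY updated_at DESC",
    (4295853,),
).fetchall()
print(rows)  # [(277028448,), (276474582,)]
```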
Powered by Datasette · Queries took 35.988ms · About: xarray-datasette