html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/pull/1198#issuecomment-288832196,https://api.github.com/repos/pydata/xarray/issues/1198,288832196,MDEyOklzc3VlQ29tbWVudDI4ODgzMjE5Ng==,1217238,2017-03-23T19:19:59Z,2017-03-23T19:19:59Z,MEMBER,"OK, in it goes!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-288559417,https://api.github.com/repos/pydata/xarray/issues/1198,288559417,MDEyOklzc3VlQ29tbWVudDI4ODU1OTQxNw==,1217238,2017-03-22T22:27:36Z,2017-03-22T22:27:36Z,MEMBER,"> @shoyer, if we generally cover test_backends for autoclose=True, then we should get the pickle testing for free Agreed, that should do it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-288513495,https://api.github.com/repos/pydata/xarray/issues/1198,288513495,MDEyOklzc3VlQ29tbWVudDI4ODUxMzQ5NQ==,1217238,2017-03-22T19:32:53Z,2017-03-22T19:32:53Z,MEMBER,"Subclasses also work in place of fixtures in many cases (we use them in much of the existing code). On Wed, Mar 22, 2017 at 12:30 PM Phillip Wolfram wrote: > *@pwolfram* commented on this pull request. > ------------------------------ > > In xarray/tests/test_backends.py > : > > > > with self.assertRaisesRegexp(IOError, 'no files to open'): > - open_mfdataset('foo-bar-baz-*.nc') > + for close in [True, False]: > > I think I understand how to use fixtures for both of these arguments now... > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-288416595,https://api.github.com/repos/pydata/xarray/issues/1198,288416595,MDEyOklzc3VlQ29tbWVudDI4ODQxNjU5NQ==,306380,2017-03-22T14:30:38Z,2017-03-22T14:30:38Z,MEMBER,Just seeing this now. I'm very glad to see progress here. Thanks @pwolfram ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-288127957,https://api.github.com/repos/pydata/xarray/issues/1198,288127957,MDEyOklzc3VlQ29tbWVudDI4ODEyNzk1Nw==,1217238,2017-03-21T16:04:56Z,2017-03-21T16:04:56Z,MEMBER,"@pwolfram no worries, I am not concerned at all about credit for my commit :). 
It's possible to do gymnastics with git rebase to preserve history in cases like this but it's questionable if it's worth the trouble.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-287113380,https://api.github.com/repos/pydata/xarray/issues/1198,287113380,MDEyOklzc3VlQ29tbWVudDI4NzExMzM4MA==,1217238,2017-03-16T16:28:58Z,2017-03-16T16:28:58Z,MEMBER,"I restarted all the timed out tests on Travis, hopefully they pass now!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-278701754,https://api.github.com/repos/pydata/xarray/issues/1198,278701754,MDEyOklzc3VlQ29tbWVudDI3ODcwMTc1NA==,1217238,2017-02-09T16:52:53Z,2017-02-09T16:52:53Z,MEMBER,"I'm still working on it. A possible culprit is some sort of dangling reference that stops the file from being actually closed by h5py. If we can't figure it out in a few days, we can just disable autoclose for h5netcdf.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-278552122,https://api.github.com/repos/pydata/xarray/issues/1198,278552122,MDEyOklzc3VlQ29tbWVudDI3ODU1MjEyMg==,1217238,2017-02-09T05:36:41Z,2017-02-09T05:36:41Z,MEMBER,"I can reproduce issues with h5netcdf on my machine. Of course, I get slightly different error messages -- either an error about closing a closed file from h5py or a segfault. So far, I've verified that it isn't a multi-threading issue -- I get the same errors when dask is run in single-threaded mode (`dask.set_options(get=dask.async.get_sync`). That's not encouraging, suggesting that this may be a legitimate h5netcdf or (perhaps more likely) h5py bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-277504185,https://api.github.com/repos/pydata/xarray/issues/1198,277504185,MDEyOklzc3VlQ29tbWVudDI3NzUwNDE4NQ==,1217238,2017-02-05T08:14:51Z,2017-02-05T08:14:51Z,MEMBER,"I'll take a look tomorrow. Getting all these backends to behave correctly and consistently is a constant battle. On Sat, Feb 4, 2017 at 9:20 PM Phillip Wolfram wrote: > @shoyer , the pushed code represents my > progress. The initial PR had a bug-- essentially a calculation couldn't be > performed following the load. This fixes that bug and provides a test to > ensure that this doesn't happen. However, I'm having trouble with h5netcdf, > which I'm not very familiar with compared to netcdf. This represents my > current progress, I just need some more time (or even inspiration from you) > to sort out this last key issue... 
> > I'm getting the following error: > > ================================================================================================================== FAILURES ================================================================================================================== > ___________________________________________________________________________________________ OpenMFDatasetTest.test_4_open_large_num_files_h5netcdf ___________________________________________________________________________________________ > > self = > > @requires_dask > @requires_h5netcdf > def test_4_open_large_num_files_h5netcdf(self):> self.validate_open_mfdataset_large_num_files(engine=['h5netcdf']) > > xarray/tests/test_backends.py:1040: > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > xarray/tests/test_backends.py:1018: in validate_open_mfdataset_large_num_files > self.assertClose(ds.foo.sum().values, np.sum(randdata)) > xarray/core/dataarray.py:400: in values > return self.variable.values > xarray/core/variable.py:306: in values > return _as_array_or_item(self._data) > xarray/core/variable.py:182: in _as_array_or_item > data = np.asarray(data) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/numpy/core/numeric.py:482: in asarray > return array(a, dtype, copy=False, order=order) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/array/core.py:1025: in __array__ > x = self.compute() > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/base.py:79: in compute > return compute(self, **kwargs)[0] > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/base.py:179: in compute > results = get(dsk, keys, **kwargs) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:537: in get_sync > raise_on_exception=True, **kwargs) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:500: in get_async > fire_task() > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:476: in fire_task > callback=queue.put) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:525: in apply_sync > res = func(*args, **kwds) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:268: in execute_task > result = _execute_task(task, data) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:248: in _execute_task > args2 = [_execute_task(a, cache) for a in args] > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:248: in > args2 = [_execute_task(a, cache) for a in args] > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:245: in _execute_task > return [_execute_task(a, cache) for a in arg] > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:245: in > return [_execute_task(a, cache) for a in arg] > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/async.py:249: in _execute_task > return func(*args2) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/dask/array/core.py:52: in getarray > c = a[b] > xarray/core/indexing.py:401: in __getitem__ > return type(self)(self.array[key]) > xarray/core/indexing.py:376: in __getitem__ > return type(self)(self.array, self._updated_key(key)) > 
xarray/core/indexing.py:354: in _updated_key > for size, k in zip(self.array.shape, self.key): > xarray/core/indexing.py:364: in shape > for size, k in zip(self.array.shape, self.key): > xarray/core/utils.py:414: in shape > return self.array.shape > xarray/backends/netCDF4_.py:37: in __getattr__ > return getattr(self.datastore.ds.variables[self.var], attr) > ../../anaconda/envs/test_env_xarray35/lib/python3.5/contextlib.py:66: in __exit__ > next(self.gen) > xarray/backends/h5netcdf_.py:105: in ensure_open > self.close() > xarray/backends/h5netcdf_.py:190: in close > _close_ds(self.ds) > xarray/backends/h5netcdf_.py:70: in _close_ds > find_root(ds).close() > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/h5netcdf/core.py:458: in close > self._h5file.close() > ../../anaconda/envs/test_env_xarray35/lib/python3.5/site-packages/h5py/_hl/files.py:302: in close > self.id.close() > h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2840) > ??? > h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2798) > ??? > h5py/h5f.pyx:282: in h5py.h5f.FileID.close (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/h5f.c:3905) > ??? > h5py/_objects.pyx:54: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2840) > ??? > h5py/_objects.pyx:55: in h5py._objects.with_phil.wrapper (/Users/travis/miniconda3/conda-bld/work/h5py-2.6.0/h5py/_objects.c:2798) > ??? > _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > > ??? > E RuntimeError: dictionary changed size during iteration > > h5py/_objects.pyx:119: RuntimeError > ============================================================================================ 1 failed, 1415 passed, 95 skipped in 116.54 seconds ============================================================================================= > Exception ignored in: .remove at 0x10f16e598> > Traceback (most recent call last): > File ""/Users/pwolfram/anaconda/envs/test_env_xarray35/lib/python3.5/weakref.py"", line 117, in remove > TypeError: 'NoneType' object is not callable > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-277400963,https://api.github.com/repos/pydata/xarray/issues/1198,277400963,MDEyOklzc3VlQ29tbWVudDI3NzQwMDk2Mw==,1217238,2017-02-04T00:36:15Z,2017-02-04T00:36:15Z,MEMBER,@pwolfram this looks pretty close to me now -- let me know when it's ready for review.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-272252623,https://api.github.com/repos/pydata/xarray/issues/1198,272252623,MDEyOklzc3VlQ29tbWVudDI3MjI1MjYyMw==,1217238,2017-01-12T19:08:33Z,2017-01-12T19:08:33Z,MEMBER,"This should be totally fine without performance or compatibility concerns as long as we set `autoclose=False` by the default. 
In the long term, it would be nice to handle autoclosing automatically (invoking it when the number of open files exceeds some limit), but we should probably be a little more clever for that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-272077913,https://api.github.com/repos/pydata/xarray/issues/1198,272077913,MDEyOklzc3VlQ29tbWVudDI3MjA3NzkxMw==,1217238,2017-01-12T05:12:53Z,2017-01-12T05:12:53Z,MEMBER,"Don't worry about #1087 -- I can rebase it. On Wed, Jan 11, 2017 at 8:54 PM Phillip Wolfram wrote: > @shoyer , I just realized this might conflict > with #1087 . Do you foresee > this causing problems and what order do you plan to merge this PR and > #1087 (which obviously > predates this one...)? We are running into the snag with #463 > in our analysis and my > personal preference would be to get some type of solution into place sooner > than later. Thanks for considering this request. > > Also, I'm not sure exactly the best way to test performance either. Could > we potentially use something like the ""toy"" test cases for this purpose? > Ideally we would have a test case with O(100) files to gain a clearer > picture of the performance cost of this PR. > > Please let me know what you want me to do with this PR-- should I clean it > up in anticipation of a merge or just wait for now to see if there are > extra things that need fixed via additional testing? Note I have the full > scipy, h5netcdf and pynio implementations that can also be reviewed because > they weren't available when you did your review yesterday. > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-271996986,https://api.github.com/repos/pydata/xarray/issues/1198,271996986,MDEyOklzc3VlQ29tbWVudDI3MTk5Njk4Ng==,1217238,2017-01-11T21:16:35Z,2017-01-11T21:16:35Z,MEMBER,"> Does that mean if the checks pass the code is at least minimally correct in terms of not breaking previous design choices? E.g., does this imply that we are ok except for cleanup / implementation details on this PR? If the checks pass, it means that it doesn't directly break anything that we have tests for. Which should cover *most* functionality. However, we'll still need to be careful not to introduce performance regressions -- we don't have any automated performance tests yet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056 https://github.com/pydata/xarray/pull/1198#issuecomment-271960859,https://api.github.com/repos/pydata/xarray/issues/1198,271960859,MDEyOklzc3VlQ29tbWVudDI3MTk2MDg1OQ==,1217238,2017-01-11T18:57:58Z,2017-01-11T18:57:58Z,MEMBER,"@pwolfram the allowed failures are pre-existing, not related to this change.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,199900056
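
The comments above discuss parametrizing the backend tests over the new `autoclose` flag, either with pytest fixtures or with test subclasses. Purely as an illustrative sketch of the fixture approach, not code from the PR itself: the file names, the `tmp_path` fixture, and the test body below are hypothetical, the `autoclose` keyword is the one this PR introduces, the `concat_dim` keyword reflects the `open_mfdataset` signature of that era, and a netCDF4 backend is assumed to be installed.

```python
# Hypothetical sketch: parametrize a backend test over autoclose with a
# pytest fixture, as discussed in the PR comments. Not the PR's actual code.
import numpy as np
import pytest
import xarray as xr


@pytest.fixture(params=[True, False])
def autoclose(request):
    """Run each test that requests this fixture once per autoclose value."""
    return request.param


def test_open_mfdataset_autoclose(tmp_path, autoclose):
    # Write three tiny netCDF files, then reopen them as a single dataset.
    paths = []
    for i in range(3):
        p = tmp_path / "file{}.nc".format(i)
        xr.Dataset({"foo": ("x", np.arange(5.0) + i)}).to_netcdf(str(p))
        paths.append(str(p))

    # autoclose is the keyword added by this PR; when True, each file is
    # reopened on demand instead of being held open.
    with xr.open_mfdataset(paths, concat_dim="x", autoclose=autoclose) as ds:
        # Compute after opening so the files are actually read back
        # (sums 10, 15, and 20 from the three files).
        assert float(ds.foo.sum()) == 45.0
```

For the h5netcdf debugging mentioned above, the same comments note that forcing dask into single-threaded mode (the `dask.set_options(get=...)` call quoted there; the scheduler module was later renamed in dask) is a way to rule out multi-threading as the cause before blaming h5netcdf or h5py.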