
issue_comments: 143222580


html_url: https://github.com/pydata/xarray/issues/463#issuecomment-143222580
issue_url: https://api.github.com/repos/pydata/xarray/issues/463
id: 143222580
node_id: MDEyOklzc3VlQ29tbWVudDE0MzIyMjU4MA==
user: 380927
created_at: 2015-09-25T13:27:59Z
updated_at: 2015-09-25T13:27:59Z
author_association: NONE

I've pushed a few commits trying this out to https://github.com/cpaulik/xray/tree/closing_netcdf_backend. I can open a WIP PR if it would be easier to discuss there.

There are, however, a few tests that keep failing, and I cannot figure out why.

e.g.: test_backends.py::NetCDF4ViaDaskDataTest::test_compression_encoding:

If I set a breakpoint at line 941 of dataset.py and just continue, the test fails.

If, however, I evaluate self.variables.items() (or even just self.variables) at the breakpoint, I get the correct output and the test passes when continued. I cannot see any difference between evaluating this in ipdb and what the code on that line does.

The error I get when running the test without interference is:

``` shell
test_backends.py::NetCDF4ViaDaskDataTest::test_compression_encoding FAILED

====================================================== FAILURES =======================================================
_____________________________________ NetCDF4ViaDaskDataTest.test_compression_encoding ________________________________

self = <xray.test.test_backends.NetCDF4ViaDaskDataTest testMethod=test_compression_encoding>

    def test_compression_encoding(self):
        data = create_test_data()
        data['var2'].encoding.update({'zlib': True,
                                      'chunksizes': (5, 5),
                                      'fletcher32': True})
>       with self.roundtrip(data) as actual:

test_backends.py:502:

/usr/lib/python2.7/contextlib.py:17: in __enter__
    return self.gen.next()
test_backends.py:596: in roundtrip
    yield ds.chunk()
../core/dataset.py:942: in chunk
    for k, v in self.variables.items()])
../core/dataset.py:935: in maybe_chunk
    token2 = tokenize(name, token if token else var._data)
/home/cpa/.virtualenvs/xray/local/lib/python2.7/site-packages/dask/base.py:152: in tokenize
    return md5(str(tuple(map(normalize_token, args))).encode()).hexdigest()
../core/indexing.py:301: in __repr__
    (type(self).__name__, self.array, self.key))
../core/utils.py:377: in __repr__
    return '%s(array=%r)' % (type(self).__name__, self.array)
../core/indexing.py:301: in __repr__
    (type(self).__name__, self.array, self.key))
../core/utils.py:377: in __repr__
    return '%s(array=%r)' % (type(self).__name__, self.array)
netCDF4/_netCDF4.pyx:2931: in netCDF4._netCDF4.Variable.__repr__ (netCDF4/_netCDF4.c:25068)
    ???
netCDF4/_netCDF4.pyx:2938: in netCDF4._netCDF4.Variable.__unicode__ (netCDF4/_netCDF4.c:25243)
    ???
netCDF4/_netCDF4.pyx:3059: in netCDF4._netCDF4.Variable.dimensions.__get__ (netCDF4/_netCDF4.c:27486)
    ???
    ???
E   RuntimeError: NetCDF: Not a valid ID

netCDF4/_netCDF4.pyx:2994: RuntimeError
============================================== 1 failed in 0.50 seconds ===============================================
```
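My reading of the traceback, sketched as a minimal, dependency-free analogue (these are made-up stand-in classes, not xray's actual ones): dask's tokenize() falls back to hashing repr() for objects it doesn't recognize, and the lazy array wrapper's __repr__ reads through to the backend file, which blows up once that file has been closed.

```python
import hashlib
import io


class LazyBackendArray(object):
    """Stand-in for a lazy indexing wrapper around a backend variable."""

    def __init__(self, handle):
        self._handle = handle  # e.g. an open file / netCDF dataset

    def __repr__(self):
        # repr reads through to the backend -- fails if the handle is closed
        return '%s(data=%r)' % (type(self).__name__, self._handle.read())


def tokenize(obj):
    """Crude analogue of dask.base.tokenize: hash the object's repr."""
    return hashlib.md5(repr(obj).encode()).hexdigest()


f = io.StringIO(u'fake netCDF payload')
arr = LazyBackendArray(f)
tok_open = tokenize(arr)  # works while the handle is open

f.close()
try:
    tokenize(arr)  # repr now touches a closed handle
except ValueError as e:
    # analogous to netCDF4's "RuntimeError: NetCDF: Not a valid ID"
    print('tokenize failed:', e)
```

If this is what is happening, it would also explain the ipdb observation: evaluating self.variables at the breakpoint forces things to be read (and cached) while the file is still valid, so the later repr no longer needs the closed handle.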
