pydata/xarray issue #1229: opening NetCDF file fails with ValueError when time variable is multidimensional

State: closed (completed) · Comments: 3 · Author association: CONTRIBUTOR
Created: 2017-01-25T16:56:27Z · Closed: 2017-01-26T05:13:12Z

I have a NetCDF file whose time variable has more than one dimension. Opening it fails in xarray.open_dataset because first_n_items returns an array with shape (1,), while last_item returns an array with shape (1,) * ndim, where ndim is the number of dimensions of the time variable, and np.concatenate refuses to combine arrays with different numbers of dimensions. See the illustration below:

```
# earlier in the session: from numpy import arange; import netCDF4; import xarray
In [748]: ds = netCDF4.Dataset("test.nc", "w")

In [749]: dim = ds.createDimension("dim", 5)

In [750]: dim2 = ds.createDimension("dim2", 5)

In [751]: time = ds.createVariable("time", "u4", ("dim", "dim2"))

In [752]: time.units = "seconds since 1970-01-01"

In [753]: time.calendar = "gregorian"

In [754]: time[:, :] = arange(25).reshape(5, 5)

In [755]: ds.close()

In [757]: xarray.open_dataset("test.nc")

ValueError                                Traceback (most recent call last)
<ipython-input-757-17ad46b81538> in <module>()
----> 1 xarray.open_dataset("test.nc")

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables)
    300             lock = _default_lock(filename_or_obj, engine)
    301         with close_on_error(store):
--> 302             return maybe_decode_store(store, lock)
    303     else:
    304         if engine is not None and engine != 'scipy':

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    221             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    222             concat_characters=concat_characters, decode_coords=decode_coords,
--> 223             drop_variables=drop_variables)
    224
    225         _protect_dataset_variables_inplace(ds, cache)

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    947     vars, attrs, coord_names = decode_cf_variables(
    948         vars, attrs, concat_characters, mask_and_scale, decode_times,
--> 949         decode_coords, drop_variables=drop_variables)
    950     ds = Dataset(vars, attrs=attrs)
    951     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    882         new_vars[k] = decode_cf_variable(
    883             v, concat_characters=concat, mask_and_scale=mask_and_scale,
--> 884             decode_times=decode_times)
    885         if decode_coords:
    886             var_attrs = new_vars[k].attrs

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variable(var, concat_characters, mask_and_scale, decode_times, decode_endianness)
    819         units = pop_to(attributes, encoding, 'units')
    820         calendar = pop_to(attributes, encoding, 'calendar')
--> 821         data = DecodedCFDatetimeArray(data, units, calendar)
    822     elif attributes['units'] in TIME_UNITS:
    823         # timedelta

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in __init__(self, array, units, calendar)
    384         # Dataset.__repr__ when users try to view their lazily decoded array.
    385         example_value = np.concatenate([first_n_items(array, 1) or [0],
--> 386                                         last_item(array) or [0]])
    387
    388         try:

ValueError: all the input arrays must have same number of dimensions
```
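The ValueError comes straight from numpy: np.concatenate cannot join arrays whose numbers of dimensions differ. A minimal standalone sketch of the mismatch described above, using plain numpy rather than xarray's internal helpers:

```python
import numpy as np

# first_n_items(array, 1) yields a flat, 1-D array of length 1 ...
first = np.array([0], dtype="u4")       # shape (1,)
# ... while last_item(array) keeps one length-1 axis per dimension of the
# 2-D time variable, so its result is 2-D.
last = np.array([[24]], dtype="u4")     # shape (1, 1)

# This mirrors the call that fails inside DecodedCFDatetimeArray.__init__:
np.concatenate([first, last])
# ValueError: all the input arrays must have same number of dimensions
```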

A closer look in the debugger:

```
In [758]: %debug xarray.open_dataset("test.nc")
NOTE: Enter 'c' at the ipdb> prompt to continue execution.

> <string>(1)<module>()

ipdb> break /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385
Breakpoint 1 at /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385
ipdb> cont
> /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py(385)__init__()
    383         # successfully. Otherwise, tracebacks end up swallowed by
    384         # Dataset.__repr__ when users try to view their lazily decoded array.
1-> 385         example_value = np.concatenate([first_n_items(array, 1) or [0],
    386                                         last_item(array) or [0]])
    387

ipdb> p first_n_items(array, 1).shape
(1,)
ipdb> p last_item(array).shape
(1, 1)
```
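As a stop-gap, the decoding step that triggers this can be skipped with decode_times=False (a keyword of open_dataset, visible in the traceback signature above), which leaves the raw integer seconds untouched. A possible fix on the xarray side would be to flatten both sample values before concatenating them; the snippet below is only a sketch of that idea with a hypothetical helper, not the actual patch:

```python
import numpy as np
import xarray

# Workaround: skip CF time decoding so the multidimensional "time"
# variable is loaded as plain unsigned integers.
ds = xarray.open_dataset("test.nc", decode_times=False)

# Sketch of a fix (hypothetical helper, not xarray's code): force both
# sample values to 1-D so np.concatenate no longer sees mismatched ndim.
def example_value(first, last):
    return np.concatenate([np.ravel(first), np.ravel(last)])
```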
