id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
714844298,MDExOlB1bGxSZXF1ZXN0NDk3ODU3MTA0,4485,Handle scale_factor and add_offset as scalar,500246,closed,0,,,3,2020-10-05T13:31:36Z,2020-10-16T21:20:14Z,2020-10-11T20:06:33Z,CONTRIBUTOR,,0,pydata/xarray/pulls/4485,"The h5netcdf engine exposes single-valued attributes as arrays of shape (1,), which is correct according to the NetCDF standard, but may cause a problem when reading a value of shape () before the scale_factor and add_offset have been applied. This PR adds a check on the dimensionality of add_offset and scale_factor to ensure they are scalar before they are used for further processing, adds a unit test to verify that this works correctly, and adds a note to the documentation warning users of this difference between the h5netcdf and netcdf4 engines.

- [x] Closes #4471
- [x] Tests added
- [x] Passes `isort . && black . && mypy . && flake8`
- [x] User visible changes (including notable bug fixes) are documented in `whats-new.rst`
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/4485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
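Not part of the PR above, just a minimal sketch of the kind of check it describes: a hypothetical helper (`_ensure_scalar` and `decode_scaled` are illustrative names, not xarray API) that collapses a shape-(1,) `scale_factor`/`add_offset` to a plain scalar before CF-style unpacking, so that decoding a shape-() value no longer changes its shape.

```
import numpy as np

def _ensure_scalar(value):
    # Hypothetical helper, not xarray API: collapse a size-1 array,
    # e.g. the shape-(1,) attributes exposed by the h5netcdf engine,
    # to a plain scalar; leave everything else untouched.
    value = np.asarray(value)
    if value.ndim > 0 and value.size == 1:
        return value.item()
    return value

def decode_scaled(data, attrs):
    # CF-style unpacking: decoded = packed * scale_factor + add_offset
    scale_factor = _ensure_scalar(attrs.get("scale_factor", 1))
    add_offset = _ensure_scalar(attrs.get("add_offset", 0))
    return np.asarray(data) * scale_factor + add_offset

packed = np.int16(250)  # a shape-() packed value
# Scalar attributes (netcdf4-style) and shape-(1,) attributes (h5netcdf-style)
# now decode to the same shape-() result:
print(decode_scaled(packed, {"scale_factor": 0.01, "add_offset": 273.15}))
print(decode_scaled(packed, {"scale_factor": np.array([0.01]),
                             "add_offset": np.array([273.15])}))
```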
203159853,MDU6SXNzdWUyMDMxNTk4NTM=,1229,opening NetCDF file fails with ValueError when time variable is multidimensional,500246,closed,0,,,3,2017-01-25T16:56:27Z,2017-01-26T05:13:12Z,2017-01-26T05:13:12Z,CONTRIBUTOR,,,,"I have a NetCDF file that includes a time field with multiple dimensions. This leads to a failure in `xarray.open_dataset`, because `first_n_items` returns an object with shape `(1,)`, but `last_item` returns an object with shape `(1,)*ndim` where `ndim` is the number of dimensions for the time variable.

See the illustration below:

```
In [748]: ds = netCDF4.Dataset(""test.nc"", ""w"")

In [749]: dim = ds.createDimension(""dim"", 5)

In [750]: dim2 = ds.createDimension(""dim2"", 5)

In [751]: time = ds.createVariable(""time"", ""u4"", (""dim"", ""dim2""))

In [752]: time.units = ""seconds since 1970-01-01""

In [753]: time.calendar = ""gregorian""

In [754]: time[:, :] = arange(25).reshape(5, 5)

In [755]: ds.close()

In [757]: xarray.open_dataset(""test.nc"")
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
 in ()
----> 1 xarray.open_dataset(""test.nc"")

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables)
    300 lock = _default_lock(filename_or_obj, engine)
    301 with close_on_error(store):
--> 302 return maybe_decode_store(store, lock)
    303 else:
    304 if engine is not None and engine != 'scipy':

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    221 store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    222 concat_characters=concat_characters, decode_coords=decode_coords,
--> 223 drop_variables=drop_variables)
    224 
    225 _protect_dataset_variables_inplace(ds, cache)

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    947 vars, attrs, coord_names = decode_cf_variables(
    948 vars, attrs, concat_characters, mask_and_scale, decode_times,
--> 949 decode_coords, drop_variables=drop_variables)
    950 ds = Dataset(vars, attrs=attrs)
    951 ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    882 new_vars[k] = decode_cf_variable(
    883 v, concat_characters=concat, mask_and_scale=mask_and_scale,
--> 884 decode_times=decode_times)
    885 if decode_coords:
    886 var_attrs = new_vars[k].attrs

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in decode_cf_variable(var, concat_characters, mask_and_scale, decode_times, decode_endianness)
    819 units = pop_to(attributes, encoding, 'units')
    820 calendar = pop_to(attributes, encoding, 'calendar')
--> 821 data = DecodedCFDatetimeArray(data, units, calendar)
    822 elif attributes['units'] in TIME_UNITS:
    823 # timedelta

/dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py in __init__(self, array, units, calendar)
    384 # Dataset.__repr__ when users try to view their lazily decoded array.
    385 example_value = np.concatenate([first_n_items(array, 1) or [0],
--> 386 last_item(array) or [0]])
    387 
    388 try:

ValueError: all the input arrays must have same number of dimensions
```

Closer look in the debugger:

```
In [758]: %debug xarray.open_dataset(""test.nc"")
NOTE: Enter 'c' at the ipdb> prompt to continue execution.
> (1)()
ipdb> break /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385
Breakpoint 1 at /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py:385
ipdb> cont
> /dev/shm/gerrit/venv/stable-3.5/lib/python3.5/site-packages/xarray/conventions.py(385)__init__()
    383 # successfully. Otherwise, tracebacks end up swallowed by
    384 # Dataset.__repr__ when users try to view their lazily decoded array.
1-> 385 example_value = np.concatenate([first_n_items(array, 1) or [0],
    386 last_item(array) or [0]])
    387 

ipdb> p first_n_items(array, 1).shape
(1,)

ipdb> p last_item(array).shape
(1, 1)
```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue
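Not from the issue itself: a stand-alone NumPy sketch of the shape mismatch shown in the debugger session above, with `first` and `last` standing in for what `first_n_items(array, 1)` and `last_item(array)` return for a 2-D time variable. Raveling both samples before `np.concatenate` is one possible remedy, not necessarily the fix that was ultimately applied in xarray.

```
import numpy as np

# A 2-D "time" variable like the one created in the example above.
array = np.arange(25, dtype="u4").reshape(5, 5)

first = array.flat[:1]                         # stands in for first_n_items(array, 1): shape (1,)
last = array[(slice(-1, None),) * array.ndim]  # stands in for last_item(array): shape (1, 1)

# Reproduces the error from the traceback:
#   np.concatenate([first, last]) -> ValueError: all the input arrays must
#   have same number of dimensions

# Flattening both samples to 1-D before concatenating avoids the mismatch:
example_value = np.concatenate([np.ravel(first), np.ravel(last)])
print(example_value)  # [ 0 24]
```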