id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 281423161,MDExOlB1bGxSZXF1ZXN0MTU3ODU2NTEx,1776,[WIP] Fix pydap array wrapper,6815844,closed,0,,3008859,6,2017-12-12T15:22:07Z,2019-09-25T15:44:19Z,2018-01-09T01:48:13Z,MEMBER,,0,pydata/xarray/pulls/1776," - [x] Closes #1775 (remove if there is no corresponding issue, which should only be the case for minor changes) - [x] Tests added (for all bug fixes or enhancements) - [x] Tests passed (for all non-documentation changes) - [x] Passes ``git diff upstream/master **/*py | flake8 --diff`` (remove if you did not edit any Python files) - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later) I am trying to fix #1775, but tests are still failing. Any help would be appreciated.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1776/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 307224717,MDU6SXNzdWUzMDcyMjQ3MTc=,2002,Unexpected decoded time in xarray >= 0.10.1,9655353,closed,0,,3008859,8,2018-03-21T12:28:54Z,2018-03-31T01:16:14Z,2018-03-31T01:16:14Z,NONE,,,,"#### Problem description Given the original time dimension: ```python ds = xr.open_mfdataset(""C:\\Users\\janis\\.cate\\data_stores\\local\\local.SST_should_fail\\*.nc"", decode_cf=False) ``` ``` array([788961600, 789048000, 789134400, 789220800, 789307200, 789393600, 789480000, 789566400, 789652800, 789739200, 789825600, 789912000, 789998400, 790084800, 790171200, 790257600, 790344000, 790430400, 790516800, 790603200, 790689600, 790776000, 790862400, 790948800, 791035200, 791121600, 791208000, 791294400, 791380800, 791467200, 791553600, 791640000], dtype=int64) Coordinates: * time (time) int64 788961600 789048000 789134400 789220800 789307200 ... 
Attributes: standard_name: time axis: T comment: bounds: time_bnds long_name: reference time of sst file _ChunkSizes: 1 units: seconds since 1981-01-01 calendar: gregorian ``` Produces this decoded time dimension with `xarray >= 0.10.1`: ```python ds = xr.open_mfdataset(""C:\\Users\\janis\\.cate\\data_stores\\local\\local.SST_should_fail\\*.nc"", decode_cf=True) ``` ``` array(['1981-01-01T00:00:00.627867648', '1980-12-31T23:59:58.770774016', '1981-01-01T00:00:01.208647680', '1980-12-31T23:59:59.351554048', '1981-01-01T00:00:01.789427712', '1980-12-31T23:59:59.932334080', '1980-12-31T23:59:58.075240448', '1981-01-01T00:00:00.513114112', '1980-12-31T23:59:58.656020480', '1981-01-01T00:00:01.093894144', '1980-12-31T23:59:59.236800512', '1981-01-01T00:00:01.674674176', '1980-12-31T23:59:59.817580544', '1980-12-31T23:59:57.960486912', '1981-01-01T00:00:00.398360576', '1980-12-31T23:59:58.541266944', '1981-01-01T00:00:00.979140608', '1980-12-31T23:59:59.122046976', '1981-01-01T00:00:01.559920640', '1980-12-31T23:59:59.702827008', '1981-01-01T00:00:02.140700672', '1981-01-01T00:00:00.283607040', '1980-12-31T23:59:58.426513408', '1981-01-01T00:00:00.864387072', '1980-12-31T23:59:59.007293440', '1981-01-01T00:00:01.445167104', '1980-12-31T23:59:59.588073472', '1981-01-01T00:00:02.025947136', '1981-01-01T00:00:00.168853504', '1980-12-31T23:59:58.311759872', '1981-01-01T00:00:00.749633536', '1980-12-31T23:59:58.892539904'], dtype='datetime64[ns]') Coordinates: * time (time) datetime64[ns] 1981-01-01T00:00:00.627867648 ... Attributes: standard_name: time axis: T comment: bounds: time_bnds long_name: reference time of sst file _ChunkSizes: 1 ``` #### Expected Output With ``xarray == 0.10.0`` the output is as expected: ```python ds = xr.open_mfdataset(""C:\\Users\\janis\\.cate\\data_stores\\local\\local.SST_should_fail\\*.nc"", decode_cf=True) ``` ``` array(['2006-01-01T12:00:00.000000000', '2006-01-02T12:00:00.000000000', '2006-01-03T12:00:00.000000000', '2006-01-04T12:00:00.000000000', '2006-01-05T12:00:00.000000000', '2006-01-06T12:00:00.000000000', '2006-01-07T12:00:00.000000000', '2006-01-08T12:00:00.000000000', '2006-01-09T12:00:00.000000000', '2006-01-10T12:00:00.000000000', '2006-01-11T12:00:00.000000000', '2006-01-12T12:00:00.000000000', '2006-01-13T12:00:00.000000000', '2006-01-14T12:00:00.000000000', '2006-01-15T12:00:00.000000000', '2006-01-16T12:00:00.000000000', '2006-01-17T12:00:00.000000000', '2006-01-18T12:00:00.000000000', '2006-01-19T12:00:00.000000000', '2006-01-20T12:00:00.000000000', '2006-01-21T12:00:00.000000000', '2006-01-22T12:00:00.000000000', '2006-01-23T12:00:00.000000000', '2006-01-24T12:00:00.000000000', '2006-01-25T12:00:00.000000000', '2006-01-26T12:00:00.000000000', '2006-01-27T12:00:00.000000000', '2006-01-28T12:00:00.000000000', '2006-01-29T12:00:00.000000000', '2006-01-30T12:00:00.000000000', '2006-01-31T12:00:00.000000000', '2006-02-01T12:00:00.000000000'], dtype='datetime64[ns]') Coordinates: * time (time) datetime64[ns] 2006-01-01T12:00:00 2006-01-02T12:00:00 ... Attributes: standard_name: time axis: T comment: bounds: time_bnds long_name: reference time of sst file _ChunkSizes: 1 ``` #### Output of ``xr.show_versions()``
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 32
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 69 Stepping 1, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None
xarray: 0.10.1
pandas: 0.22.0
numpy: 1.14.2
scipy: 0.19.1
netCDF4: 1.3.1
h5netcdf: 0.5.0
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.1
distributed: 1.21.3
matplotlib: 2.2.2
cartopy: 0.16.0
seaborn: None
setuptools: 39.0.1
pip: 9.0.2
conda: None
pytest: 3.1.3
IPython: 6.2.1
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 283388962,MDExOlB1bGxSZXF1ZXN0MTU5Mjg2OTk0,1793,fix distributed writes,2443309,closed,0,,3008859,35,2017-12-19T22:24:41Z,2018-03-13T15:32:54Z,2018-03-10T15:43:18Z,MEMBER,,0,pydata/xarray/pulls/1793," - [x] Closes #1464 - [x] Tests added - [x] Tests passed - [x] Passes ``git diff upstream/master **/*py | flake8 --diff`` - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API Right now, I've just modified the dask distributed integration tests so we can all see the [failing tests](https://travis-ci.org/jhamman/xarray/jobs/317603224#L4400-L4571). I'm happy to push this further but I thought I'd see if either @shoyer or @mrocklin have an idea where to start?","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 287852184,MDU6SXNzdWUyODc4NTIxODQ=,1821,v0.10.1 Release,2443309,closed,0,,3008859,11,2018-01-11T16:56:08Z,2018-02-26T23:20:45Z,2018-02-26T01:48:32Z,MEMBER,,,,"We're close to a minor/bug-fix release (0.10.1). What do we need to get done before that can happen? - [x] #1800 Performance improvements to Zarr (@jhamman) - [ ] #1793 Fix for to_netcdf writes with dask-distributed (@jhamman, could use help) - [x] #1819 Normalisation for RGB imshow Help wanted / bugs that no-one is working on: - [ ] #1792 Comparison to masked numpy arrays - [ ] #1764 groupby_bins fails for empty bins What else? ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1821/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 295744504,MDU6SXNzdWUyOTU3NDQ1MDQ=,1898,zarr RTD docs broken,1197350,closed,0,,3008859,1,2018-02-09T03:35:05Z,2018-02-15T23:20:31Z,2018-02-15T23:20:31Z,MEMBER,,,,"This is what is getting rendered on RTD http://xarray.pydata.org/en/latest/io.html#zarr ``` In [26]: ds = xr.Dataset({'foo': (('x', 'y'), np.random.rand(4, 5))}, ....: coords={'x': [10, 20, 30, 40], ....: 'y': pd.date_range('2000-01-01', periods=5), ....: 'z': ('x', list('abcd'))}) ....: In [27]: ds.to_zarr('path/to/directory.zarr') --------------------------------------------------------------------------- AttributeError Traceback (most recent call last) in () ----> 1 ds.to_zarr('path/to/directory.zarr') /home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/core/dataset.py in to_zarr(self, store, mode, synchronizer, group, encoding) 1165 from ..backends.api import to_zarr 1166 return to_zarr(self, store=store, mode=mode, synchronizer=synchronizer, -> 1167 group=group, encoding=encoding) 1168 1169 def __unicode__(self): /home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/backends/api.py in to_zarr(dataset, store, mode, synchronizer, group, encoding) 752 # I think zarr stores should always be sync'd immediately 753 # TODO: figure out how to properly handle unlimited_dims --> 754 dataset.dump_to_store(store, sync=True, encoding=encoding) 755 return store 
/home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/core/dataset.py in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims) 1068 1069 store.store(variables, attrs, check_encoding, -> 1070 unlimited_dims=unlimited_dims) 1071 if sync: 1072 store.sync() /home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/backends/zarr.py in store(self, variables, attributes, *args, **kwargs) 378 def store(self, variables, attributes, *args, **kwargs): 379 AbstractWritableDataStore.store(self, variables, attributes, --> 380 *args, **kwargs) 381 382 /home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/backends/common.py in store(self, variables, attributes, check_encoding_set, unlimited_dims) 275 variables, attributes = self.encode(variables, attributes) 276 --> 277 self.set_attributes(attributes) 278 self.set_dimensions(variables, unlimited_dims=unlimited_dims) 279 self.set_variables(variables, check_encoding_set, /home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.5/site-packages/xarray-0.10.0+dev55.g1d32399-py3.5.egg/xarray/backends/zarr.py in set_attributes(self, attributes) 341 342 def set_attributes(self, attributes): --> 343 self.ds.attrs.put(attributes) 344 345 def encode_variable(self, variable): AttributeError: 'Attributes' object has no attribute 'put' ```","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1898/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 284607311,MDExOlB1bGxSZXF1ZXN0MTYwMTY1NjI3,1800,WIP: Performance improvements for zarr backend,2443309,closed,0,,3008859,6,2017-12-26T20:37:45Z,2018-01-24T14:56:57Z,2018-01-24T14:55:52Z,MEMBER,,0,pydata/xarray/pulls/1800," - [x] Closes #https://github.com/pangeo-data/pangeo/issues/48 - [x] Tests added (for all bug fixes or enhancements) - [x] Tests passed (for all non-documentation changes) - [x] Passes ``git diff upstream/master **/*py | flake8 --diff`` (remove if you did not edit any Python files) - [x] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later) This is building on top of #1799. Based on the suggestion from @alimanfoo in https://github.com/pangeo-data/pangeo/issues/48#issuecomment-353807691, I have reworked the handling of attributes in the zarr backend. There is more to do here, particularly in the `set_dimensions` arena but this is giving almost a 2x speedup in writing to GCP. 
cc @rabernat, @mrocklin and @alimanfoo ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1800/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull 282061228,MDU6SXNzdWUyODIwNjEyMjg=,1781,UnboundLocalError when opening netCDF file,16152387,closed,0,,3008859,1,2017-12-14T11:01:23Z,2018-01-11T16:53:09Z,2018-01-11T16:53:09Z,NONE,,,,"#### Code Sample, a copy-pastable example if possible ```python import xarray as xr import netCDF4 as nc product = '/Users/stefano/src/s5p/products/NO2/'\ 'S5P_OFFL_L2__NO2____20171107T195219_20171107T213349_00361_01_001107_20171108T122727.nc' # opening the product with netCDF4 works fine no2 = nc.Dataset(product) no2.groups # correctly shows groups' content ``` ``` OrderedDict([('PRODUCT', group /PRODUCT: dimensions(sizes): scanline(5640), ground_pixel(450), corner(4), time(1), polynomial_exponents(6), layer(34), vertices(2) variables(dimensions): int32 scanline(scanline), int32 ground_pixel(ground_pixel), int32 time(time), int32 corner(corner), int32 polynomial_exponents(polynomial_exponents), int32 layer(layer), int32 vertices(vertices), float32 latitude(time,scanline,ground_pixel), float32 longitude(time,scanline,ground_pixel), int32 delta_time(time,scanline), time_utc(time,scanline), uint8 qa_value(time,scanline,ground_pixel), float32 nitrogendioxide_tropospheric_column(time,scanline,ground_pixel), float32 nitrogendioxide_tropospheric_column_precision(time,scanline,ground_pixel), float32 averaging_kernel(time,scanline,ground_pixel,layer), float32 air_mass_factor_troposphere(time,scanline,ground_pixel), float32 air_mass_factor_total(time,scanline,ground_pixel), int32 tm5_tropopause_layer_index(time,scanline,ground_pixel), float32 tm5_constant_a(layer,vertices), float32 tm5_constant_b(layer,vertices) groups: SUPPORT_DATA), ('METADATA', group /METADATA: dimensions(sizes): variables(dimensions): groups: QA_STATISTICS, ALGORITHM_SETTINGS, GRANULE_DESCRIPTION, ISO_METADATA, EOP_METADATA, ESA_METADATA)]) ``` ``` # opening the product with xarray raises an UnboundLocalError exception no2 = xr.open_dataset(product, group='/PRODUCT') ``` ``` --------------------------------------------------------------------------- UnboundLocalError Traceback (most recent call last) in () ----> 1 no2 = xr.open_dataset(product, group='/PRODUCT') /Users/stefano/anaconda/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables) 303 lock = _default_lock(filename_or_obj, engine) 304 with close_on_error(store): --> 305 return maybe_decode_store(store, lock) 306 else: 307 if engine is not None and engine != 'scipy': /Users/stefano/anaconda/lib/python3.6/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock) 223 store, mask_and_scale=mask_and_scale, decode_times=decode_times, 224 concat_characters=concat_characters, decode_coords=decode_coords, --> 225 drop_variables=drop_variables) 226 227 _protect_dataset_variables_inplace(ds, cache) /Users/stefano/anaconda/lib/python3.6/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 1153 vars, attrs, coord_names = decode_cf_variables( 1154 vars, attrs, concat_characters, mask_and_scale, decode_times, -> 1155 decode_coords, drop_variables=drop_variables) 1156 ds = 
Dataset(vars, attrs=attrs) 1157 ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars)) /Users/stefano/anaconda/lib/python3.6/site-packages/xarray/conventions.py in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables) 1086 k, v, concat_characters=concat_characters, 1087 mask_and_scale=mask_and_scale, decode_times=decode_times, -> 1088 stack_char_dim=stack_char_dim) 1089 if decode_coords: 1090 var_attrs = new_vars[k].attrs /Users/stefano/anaconda/lib/python3.6/site-packages/xarray/conventions.py in decode_cf_variable(name, var, concat_characters, mask_and_scale, decode_times, decode_endianness, stack_char_dim) 998 if (has_fill or scale_factor is not None or add_offset is not None): 999 if has_fill and np.array(fill_value).dtype.kind in ['U', 'S', 'O']: -> 1000 if string_encoding is not None: 1001 raise NotImplementedError( 1002 'variable %r has a _FillValue specified, but ' UnboundLocalError: local variable 'string_encoding' referenced before assignment ``` ```python # Opening another group with xarray works fine no2 = xr.open_dataset(product, group='/METADATA/QA_STATISTICS') no2 ``` ``` Dimensions: (nitrogendioxide_stratospheric_column_histogram_axis: 100, nitrogendioxide_stratospheric_column_pdf_axis: 400, nitrogendioxide_total_column_histogram_axis: 100, nitrogendioxide_total_column_pdf_axis: 400, nitrogendioxide_tropospheric_column_histogram_axis: 100, nitrogendioxide_tropospheric_column_pdf_axis: 400, vertices: 2) [...] ``` #### Problem description An UnboundLocalError exception is raised while trying to open a netCDF file on a specific group ('```/PRODUCT```'). Opening the file within another group works fine. The same file can be correctly opened with the netCDF4 library. #### Expected Output File correctly loaded. #### Output of ``xr.show_versions()``
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.1.final.0
python-bits: 64
OS: Darwin
OS-release: 17.2.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None
xarray: 0.10.0
pandas: 0.20.1
numpy: 1.12.1
scipy: 0.19.0
netCDF4: 1.2.4
h5netcdf: 0.5.0
Nio: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.15.3
matplotlib: 2.1.0
cartopy: 0.15.1
seaborn: 0.8.1
setuptools: 27.2.0
pip: 9.0.1
conda: 4.3.29
pytest: 3.0.7
IPython: 5.3.0
sphinx: 1.5.6
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 281020451,MDU6SXNzdWUyODEwMjA0NTE=,1775,AttributeError: 'PydapArrayWrapper' object has no attribute 'shape',6815953,closed,0,,3008859,3,2017-12-11T13:41:20Z,2018-01-09T01:48:13Z,2018-01-09T01:48:13Z,NONE,,,,"#### Code Sample, a copy-pastable example if possible See my code [here](https://gist.github.com/kuchaale/422a37851113ad0a52b28bc14d296674) #### Problem description I received `AttributeError: 'PydapArrayWrapper' object has no attribute 'shape'` when I tried to open `PydapDataStore`. However, everything works when I use `pydap` instead of `xarray`. #### Expected Output `xarray.Dataset` object #### Output of ``xr.show_versions()``
INSTALLED VERSIONS
------------------
commit: None
python: 3.5.2.final.0
python-bits: 64
OS: Linux
OS-release: 4.10.0-38-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
xarray: 0.10.0
pandas: 0.21.0
numpy: 1.13.3
scipy: 1.0.0
netCDF4: 1.3.1
h5netcdf: None
Nio: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.15.4
matplotlib: 2.1.0
cartopy: 0.15.1
seaborn: 0.8.1
setuptools: 36.6.0
pip: 9.0.1
conda: None
pytest: None
IPython: 6.2.1
sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1775/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue