id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type 584429748,MDU6SXNzdWU1ODQ0Mjk3NDg=,3867,macos py38 CI failing,2448579,closed,0,,,3,2020-03-19T13:54:10Z,2020-03-29T22:13:26Z,2020-03-29T22:13:26Z,MEMBER,,,,"`import matplotlib` is failing when it imports `PIL` ```python E ImportError: dlopen(/usr/local/miniconda/envs/xarray-tests/lib/python3.8/site-packages/PIL/_imaging.cpython-38-darwin.so, 2): Library not loaded: @rpath/libwebp.7.dylib E Referenced from: /usr/local/miniconda/envs/xarray-tests/lib/libtiff.5.dylib E Reason: Incompatible library version: libtiff.5.dylib requires version 9.0.0 or later, but libwebp.7.dylib provides version 8.0.0 /usr/local/miniconda/envs/xarray-tests/lib/python3.8/site-packages/PIL/Image.py:69: ImportError ``` ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3867/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 200364693,MDU6SXNzdWUyMDAzNjQ2OTM=,1201,pass projection argument to plt.subplot when faceting with cartopy transform,731499,closed,0,,,10,2017-01-12T13:18:52Z,2020-03-29T16:30:29Z,2020-03-29T16:30:29Z,CONTRIBUTOR,,,,"I have a `data` 3D DataArray with `Time`, `Latitude` and `Longitude` coordinates. I want to plot maps of this dataset, faceted by Time. The following code ``` import cartopy.crs as ccrs proj = ccrs.PlateCarree() data.plot(transform=proj, col='Time', col_wrap=3, robust=True) ``` fails with ``` ValueError: Axes should be an instance of GeoAxes, got ``` this is because to plot with a transform, the axes must be a GeoAxes, which is done with something like `plt.subplot(111, projection=proj)`. The implicit subplotting done when faceting does not do that. 
To make the faceting work, I had to do ``` import cartopy.crs as ccrs proj = ccrs.PlateCarree() data.plot(transform=proj, col='Time', col_wrap=3, robust=True, subplot_kws={'projection':proj}) ``` I propose that, when plot faceting is requested with a `transform` kw, the content of that keyword should be passed to the subplot function as a `projection` argument automatically by default. If a projection is provided explicitly like in the call above, use that one.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/1201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 502060636,MDU6SXNzdWU1MDIwNjA2MzY=,3368,Shift DataArray along a coordinate for different values of each element of another coordinate,49461634,closed,0,,,1,2019-10-03T13:17:10Z,2020-03-29T14:18:11Z,2020-03-29T14:16:43Z,NONE,,,,"#### MCVE Code Sample ```python # Your code here Original_DataArray = Shift_Map = array([ 0, 0, 0, 24, 0, 24, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 24, 36, 60, 36, 0]) #""Multi"" Shift Function def _fn(dataArray,over_shift_index,multi_shift_index,shift_map,initialValues=None): """""" -dataArray: dataarray to shift -over_shift_index: index over which to apply the shift of the dataarray -multi_shift_index: index with the elements to apply a different shift value to -shift_map: dataarray indexed by multi_shift_index with the values to shift for each element """""" _da = dataArray.copy() _shift_map = shift_map.astype(int) for name, sl in _da.groupby(multi_shift_index.name): _shift = subscript(_shift_map, multi_shift_index, name ).values.tolist() _sl = sl.squeeze(multi_shift_index.name).shift(time = _shift) _dict = {multi_shift_index.name : name} _da.loc[_dict] = _sl return _da.fillna(0.) ``` #### Expected Output A DataArray with the same structure, shifted along the ""time"" coordinate by a different value for each element of the ""res_segmentacao"" coordinate. 
```python exp_output = multidynamic(Original_DataArray, time, res_segmentacao, Shift_Map) ``` #### Problem Description I've reached my objective but I wanted to ask whether anyone had done this in a more efficient way. Thanks! ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 327392061,MDU6SXNzdWUzMjczOTIwNjE=,2196,inconsistent time coordinate types,35919497,closed,0,,,1,2018-05-29T16:14:27Z,2020-03-29T14:09:26Z,2020-03-29T14:09:26Z,COLLABORATOR,,,,"#### Code Sample, a copy-pastable example if possible ```python import numpy as np import pandas as pd import xarray as xr time = np.arange('2005-02-01', '2007-03-01', dtype='datetime64') arr = xr.DataArray( np.arange(time.size), coords=[time,], dims=('time',), name='data' ) arr.resample(time='M').interpolate('linear') --------------------------------------------------------------------------- ValueError Traceback (most recent call last) in () 7 np.arange(time.size), coords=[time,], dims=('time',), name='data' 8 ) ----> 9 arr.resample(time='M').interpolate('linear') ~/devel/c3s-cns/venv_op/lib/python3.6/site-packages/xarray/core/resample.py in interpolate(self, kind) 108 109 """""" --> 110 return self._interpolate(kind=kind) 111 112 def _interpolate(self, kind='linear'): ~/devel/c3s-cns/venv_op/lib/python3.6/site-packages/xarray/core/resample.py in _interpolate(self, kind) 218 elif self._dim not in v.dims: 219 coords[k] = v --> 220 return DataArray(f(new_x), coords, dims, name=dummy.name, 221 attrs=dummy.attrs) 222 ~/devel/c3s-cns/venv_op/lib/python3.6/site-packages/scipy/interpolate/polyint.py in __call__(self, x) 77 """""" 78 x, x_shape = self._prepare_x(x) ---> 79 y = self._evaluate(x) 80 return self._finish_y(y, x_shape) 81 ~/devel/c3s-cns/venv_op/lib/python3.6/site-packages/scipy/interpolate/interpolate.py in _evaluate(self, 
x_new) 632 y_new = self._call(self, x_new) 633 if not self._extrapolate: --> 634 below_bounds, above_bounds = self._check_bounds(x_new) 635 if len(y_new) > 0: 636 # Note fill_value must be broadcast up to the proper size ~/devel/c3s-cns/venv_op/lib/python3.6/site-packages/scipy/interpolate/interpolate.py in _check_bounds(self, x_new) 664 ""range."") 665 if self.bounds_error and above_bounds.any(): --> 666 raise ValueError(""A value in x_new is above the interpolation "" 667 ""range."") 668 ValueError: A value in x_new is above the interpolation range. ``` #### Problem description The internal format of _arr.time_ is datetime64[D] ```python arr.time array(['2005-02-01', '2005-02-02', '2005-02-03', ..., '2007-02-26', '2007-02-27', '2007-02-28'], dtype='datetime64[D]') Coordinates: * time (time) datetime64[D] 2005-02-01 2005-02-02 2005-02-03 ... ``` Internally there is a cast to float, for both the old time indices **_x_** and the new time indices **_new_x_**, but the new time indices are in datetime64[ns], so they don't match. DataArrayResample._interpolate ```python x = self._obj[self._dim].astype('float') y = self._obj.data axis = self._obj.get_axis_num(self._dim) f = interp1d(x, y, kind=kind, axis=axis, bounds_error=True, assume_sorted=True) new_x = self._full_index.values.astype('float') ``` With a cast to datetime64[ns] it works: ```python import numpy as np import pandas as pd import xarray as xr time = np.arange('2005-02-01', '2007-03-01', dtype='datetime64').astype('datetime64[ns]') arr = xr.DataArray( np.arange(time.size), coords=[time,], dims=('time',), name='data' ) arr.resample(time='M').interpolate('linear') array([ 27., 58., 88., 119., 149., 180., 211., 241., 272., 302., 333., 364., 392., 423., 453., 484., 514., 545., 576., 606., 637., 667., 698., 729., 757.]) Coordinates: * time (time) datetime64[ns] 2005-02-28 2005-03-31 2005-04-30 ... 
``` #### Expected Output ```python array([ 27., 58., 88., 119., 149., 180., 211., 241., 272., 302., 333., 364., 392., 423., 453., 484., 514., 545., 576., 606., 637., 667., 698., 729., 757.]) Coordinates: * time (time) datetime64[ns] 2005-02-28 2005-03-31 2005-04-30 ... ``` #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.6.0.final.0 python-bits: 64 OS: Linux OS-release: 4.13.0-43-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.4 pandas: 0.20.3 numpy: 1.13.1 scipy: 1.0.0 netCDF4: 1.3.1 h5netcdf: None h5py: None Nio: None zarr: None bottleneck: None cyordereddict: None dask: 0.16.1 distributed: None matplotlib: 2.0.2 cartopy: None seaborn: None setuptools: 38.4.0 pip: 10.0.1 conda: None pytest: 3.4.0 IPython: 6.1.0 sphinx: None
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 445355249,MDU6SXNzdWU0NDUzNTUyNDk=,2970,decode_cf,6872529,closed,0,,,1,2019-05-17T09:41:28Z,2020-03-29T14:02:24Z,2020-03-29T14:02:24Z,NONE,,,,"To me this name is a bit confusing as it actually _encodes_ an in-memory object to look like a _decoded_ netCDF file. I have a class which inherits from ``xarray.Dataset`` (I know about `dataset_accessor`, but it still seems easier to simply inherit...), to which I've added a method `_make_cf`: ```python import xarray as xr class XResult(xr.Dataset): def __init__(self, data=None, coords=None, attrs=None, **kwargs): if isinstance(data, str): kwargs = dict(READ_KWARGS, **kwargs) with xr.open_dataset(data, **kwargs) as data: attrs = data.attrs super().__init__(data, coords, attrs) self._make_cf() def _make_cf(self): self = xr.decode_cf(self) ``` I expect the XResult object to be _decoded_ but it is not. If I do `xresult = xarray.decode_cf(XResult(...))` then the object is indeed _decoded_ but it is no longer an `XResult` object and loses the attached functionality. 
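For what it's worth, the in-place attempt fails because `self` is just a local name inside the method; assigning to it rebinds the name without touching the object. A minimal stand-in sketch (the `Box` class below is hypothetical, not xarray API) shows the same behaviour:

```python
# Hypothetical minimal class illustrating why `self = xr.decode_cf(self)`
# inside `_make_cf` has no effect on the caller's object.
class Box:
    def __init__(self, value):
        self.value = value

    def replace_self(self, other):
        # Assignment only rebinds the local variable `self`;
        # the object the caller holds is not mutated.
        self = other

b = Box(1)
b.replace_self(Box(2))
print(b.value)  # -> 1: the original object is unchanged
```

An in-place variant would instead have to copy the decoded variables and attributes back onto `self`, rather than rebind the name.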
It would be quite convenient to have a `decode_cf` or `encode_cf` method as part of the `Dataset` class that operates in place.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 438694589,MDU6SXNzdWU0Mzg2OTQ1ODk=,2932,Facetgrid: colors beyond range (extend) not saturated,7933853,closed,0,,,5,2019-04-30T09:56:46Z,2020-03-29T13:26:43Z,2020-03-29T13:26:42Z,NONE,,,,"#### Code Sample, a copy-pastable example if possible ![Screen Shot 2019-04-30 at 11 59 30](https://user-images.githubusercontent.com/7933853/56954588-70238b00-6b3f-11e9-90ad-81045e500987.png) Minimal example here: https://github.com/lvankampenhout/bug-reports/blob/master/Facetgrid_cmap_extend.ipynb #### Problem description When faceting, the extreme colors of the pcolormesh and colorbar (using extend='both') are not saturated as they should be. #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.6.8 |Anaconda custom (x86_64)| (default, Dec 29 2018, 19:04:46) [GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)] python-bits: 64 OS: Darwin OS-release: 17.7.0 machine: x86_64 processor: i386 byteorder: little LC_ALL: C LANG: None LOCALE: None.None libhdf5: 1.10.1 libnetcdf: 4.4.1.1 xarray: 0.12.1 pandas: 0.23.4 numpy: 1.14.2 scipy: 0.18.1 netCDF4: 1.3.1 pydap: None h5netcdf: None h5py: 2.7.1 Nio: None zarr: None cftime: 1.0.0b1 nc_time_axis: None PseudonetCDF: None rasterio: None cfgrib: None iris: None bottleneck: 1.2.0 dask: 0.13.0 distributed: None matplotlib: 3.0.2 cartopy: 0.16.0 seaborn: 0.7.1 setuptools: 38.5.1 pip: 9.0.1 conda: 4.6.14 pytest: 3.0.5 IPython: 5.1.0 sphinx: 1.5.1
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/2932/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 561312864,MDU6SXNzdWU1NjEzMTI4NjQ=,3759,Truncate long lines in repr of coords,5635139,closed,0,,,1,2020-02-06T22:41:13Z,2020-03-29T09:58:46Z,2020-03-29T09:58:45Z,MEMBER,,,,"#### MCVE Code Sample ```python xr.DataArray(coords=dict(a=' '.join(['hello world' for _ in range(100)]))) array(nan) Coordinates: a array(nan) Coordinates: a I think mostly the same as https://github.com/pydata/xarray/issues/1319 but for coords #### Output of ``xr.show_versions()``
INSTALLED VERSIONS ------------------ commit: None python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: [...] machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_US.utf8 LOCALE: en_US.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.1 xarray: 0.15.0 pandas: 0.25.3 numpy: 1.17.3 scipy: 1.3.2 netCDF4: 1.5.3 pydap: None h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: None cftime: 1.0.4.2 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: 1.2.1 dask: 2.7.0 distributed: 2.7.0 matplotlib: 3.1.2 cartopy: None seaborn: 0.9.0 numbagg: None setuptools: 41.6.0.post20191101 pip: 19.3.1 conda: None pytest: 5.2.2 IPython: 7.9.0 sphinx: 2.2.1
","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 588821932,MDU6SXNzdWU1ODg4MjE5MzI=,3899,_indexes of DataArray are not deep copied,2272878,closed,0,,,4,2020-03-27T01:19:07Z,2020-03-29T02:01:20Z,2020-03-29T02:01:20Z,CONTRIBUTOR,,,,"In `DataArray.copy`, the `_indexes` attribute is not deep copied. After pull request #3840, this causes deleting a coordinate of a copy to also delete that coordinate from the original, even for deep copies. #### MCVE Code Sample ```python a0 = xr.DataArray( np.array([[1, 2, 3], [4, 5, 6]]), dims=[""y"", ""x""], coords={""x"": [""a"", ""b"", ""c""], ""y"": [-1, 1]}, ) a1 = a0.copy() del a1.coords[""y""] xr.tests.assert_identical(a0, a0) ``` The result is: ``` xarray/testing.py:272: in _assert_internal_invariants _assert_dataarray_invariants(xarray_obj) xarray/testing.py:222: in _assert_dataarray_invariants _assert_indexes_invariants_checks(da._indexes, da._coords, da.dims) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ indexes = {'x': Index(['a', 'b', 'c'], dtype='object', name='x')}, possible_coord_variables = {'x': array(['a', 'b', 'c'], dtype='<U1'), 'y': array([-1, 1])} dims = ('y', 'x') def _assert_indexes_invariants_checks(indexes, possible_coord_variables, dims): assert isinstance(indexes, dict), indexes assert all(isinstance(v, pd.Index) for v in indexes.values()), { k: type(v) for k, v in indexes.items() } index_vars = { k for k, v in possible_coord_variables.items() if isinstance(v, IndexVariable) } assert indexes.keys() <= index_vars, (set(indexes), index_vars) # Note: when we support non-default indexes, these checks should be opt-in # only! 
defaults = default_indexes(possible_coord_variables, dims) > assert indexes.keys() == defaults.keys(), (set(indexes), set(defaults)) E AssertionError: ({'x'}, {'y', 'x'}) xarray/testing.py:185: AssertionError ``` #### Expected Output The test should pass. #### Problem Description Doing a deep copy should make a copy of everything. Changing a deep copy should not alter the original in any way.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/3899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue 29136905,MDU6SXNzdWUyOTEzNjkwNQ==,60,Implement DataArray.idxmax(),1217238,closed,0,,741199,14,2014-03-10T22:03:06Z,2020-03-29T01:54:25Z,2020-03-29T01:54:25Z,MEMBER,,,,"Should match the pandas function: http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.idxmax.html ","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/60/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed,13221727,issue