issues
28 rows where user = 3958036 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1798441216 | I_kwDOAMm_X85rMgkA | 7978 | Clean up colormap code? | johnomotani 3958036 | open | 0 | 1 | 2023-07-11T08:49:36Z | 2023-07-11T15:47:52Z | CONTRIBUTOR | In fixing some bugs with color bars (https://github.com/pydata/xarray/pull/3601), we had to do some clunky workarounds because of limitations of matplotlib's API for modifying colormaps - this prompted a matplotlib issue https://github.com/matplotlib/matplotlib/issues/16296#issuecomment-1629755861. That issue has now been closed, and apparently the limitations are now fixed, so it should be possible to tidy up some of the colorbar code (at least at some point, once the oldest xarray-supported matplotlib includes the new API). I have no time to look into this myself, but opening this issue to flag the new matplotlib features in case someone is looking at refactoring colorbar or colormap code. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/7978/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
666523009 | MDU6SXNzdWU2NjY1MjMwMDk= | 4276 | isel with 0d dask array index fails | johnomotani 3958036 | closed | 0 | 0 | 2020-07-27T19:14:22Z | 2023-03-15T02:48:01Z | 2023-03-15T02:48:01Z | CONTRIBUTOR | What happened:
If a 0d dask array is passed as an argument to `isel`, the call fails with an error. What you expected to happen: `isel` should accept a 0d dask array index just as it accepts a 0d numpy array.
Minimal Complete Verifiable Example:

```python
import dask.array as daskarray
import numpy as np
import xarray as xr

a = daskarray.from_array(np.linspace(0., 1.))
da = xr.DataArray(a, dims="x")
x_selector = da.argmax(dim=...)
da_max = da.isel(x_selector)
```

Anything else we need to know?: I think the problem is here
https://github.com/pydata/xarray/blob/a198218ddabe557adbb04311b3234ec8d20419e7/xarray/core/variable.py#L546-L548
May be related to #2511, but from the code snippet above, I think this is a specific issue of 0d dask arrays rather than a generic dask-indexing issue like #2511. I'd like to fix this because it breaks the nice new features of `argmin`/`argmax`. Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-42-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.0.5 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.19.0 distributed: 2.21.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.2.0.post20200712 pip: 20.1.1 conda: 4.8.3 pytest: 5.4.3 IPython: 7.15.0 sphinx: None |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4276/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
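As a stopgap until an `isel` of this kind works directly, the dask-backed indexers can be computed before indexing (a sketch, assuming dask and xarray are installed):

```python
import dask.array as daskarray
import numpy as np
import xarray as xr

a = daskarray.from_array(np.linspace(0.0, 1.0))
da = xr.DataArray(a, dims="x")

# argmax(dim=...) returns a dict of 0d indexers backed by dask; computing
# them first turns them into plain in-memory indexers that isel() accepts.
selector = {dim: idx.compute() for dim, idx in da.argmax(dim=...).items()}
da_max = da.isel(selector)
```

The extra `compute()` forces only the tiny 0d indexers, not the full array.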
698577111 | MDU6SXNzdWU2OTg1NzcxMTE= | 4417 | Inconsistency in whether index is created with new dimension coordinate? | johnomotani 3958036 | closed | 0 | 6 | 2020-09-10T22:44:54Z | 2022-09-13T07:54:32Z | 2022-09-13T07:54:32Z | CONTRIBUTOR | It seems inconsistent that (1), renaming a variable to the dimension name and setting it as a coordinate, does not create an index, while (2), assigning the variable directly under the dimension name, does:

(1)
```
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds['a'] = ('x', np.linspace(0,1))
ds['b'] = ('x', np.linspace(3,4))
ds = ds.rename(b='x')
ds = ds.set_coords('x')
print(ds)
print('indexes', ds.indexes)
```

(2)
```
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds['a'] = ('x', np.linspace(0,1))
ds['x'] = ('x', np.linspace(3,4))
print(ds)
print('indexes', ds.indexes)
```

Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-47-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.1.1 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.23.0 distributed: 2.25.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.6.0.post20200814 pip: 20.2.3 conda: 4.8.4 pytest: 5.4.3 IPython: 7.15.0 sphinx: 3.2.1 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4417/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
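For reference, case (2), assigning a 1d variable whose name matches its own dimension, reliably creates an index in any recent xarray (a minimal check):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds["a"] = ("x", np.linspace(0, 1))
# A 1d variable named after its dimension becomes a "dimension
# coordinate", and xarray creates an index for it automatically.
ds["x"] = ("x", np.linspace(3, 4))
assert "x" in ds.indexes
```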
708337538 | MDU6SXNzdWU3MDgzMzc1Mzg= | 4456 | workaround for file with variable and dimension having same name | johnomotani 3958036 | closed | 0 | 4 | 2020-09-24T17:10:04Z | 2021-12-29T16:55:53Z | 2021-12-29T16:55:53Z | CONTRIBUTOR | Adding a variable that's not a 1d "dimension coordinate" with the same name as a dimension is an error. This makes sense. However, if I have a netCDF file (e.g. produced by some other program) that contains such a variable, xarray cannot open it at all:

```
import netCDF4
import numpy as np
import xarray as xr

f = netCDF4.Dataset("test.nc", "w")
f.createDimension("x", 2)
f.createDimension("y", 3)
f.createVariable("y", float, ("x", "y"))
f["y"][...] = 1.0
f.close()

ds = xr.open_dataset('test.nc')  # raises an error
```

I think it might be nice to have something like an option to work around this when opening the file. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4456/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
1022478180 | I_kwDOAMm_X8488cdk | 5852 | Surprising behaviour of Dataset/DataArray.interp() with NaN entries | johnomotani 3958036 | open | 0 | 0 | 2021-10-11T09:33:11Z | 2021-10-11T09:33:11Z | CONTRIBUTOR | I think this is due to documented 'undefined behaviour' of scipy's interpolation with NaN input. What happened: If a DataArray contains a NaN value and is interpolated, output values that do not depend on the entry that was NaN may still be NaN. What you expected to happen: The docs for scipy.interpolate.interp1d note that calling it with NaNs present in the input values results in undefined behaviour,
which explains the output below, and presumably means it is not fixable on the xarray side (short of some ugly work-around). I think it would be good though to check for NaNs in the input and warn about this behaviour.
What I'd initially expected was that the output would be valid at locations in the array that shouldn't depend on the NaN input: interpolating a 2d DataArray (with dims x and y) in the x-dimension, if only one y-index in the input has a NaN value, that y-index in the output might contain NaNs, but the others should be OK.

Minimal Complete Verifiable Example:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.ones([3, 4]), dims=("x", "y"))
da[0, 0] = float("nan")
newx = np.linspace(0., 3., 5)
interp_da = da.interp(x=newx)
print(interp_da)
```
You might expect at least the following, with NaN only at Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.9.6 | packaged by conda-forge | (default, Jul 11 2021, 03:39:48) [GCC 9.3.0] python-bits: 64 OS: Linux OS-release: 5.11.0-37-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: ('en_GB', 'UTF-8') libhdf5: 1.10.6 libnetcdf: 4.8.0 xarray: 0.19.0 pandas: 1.3.1 numpy: 1.21.1 scipy: 1.7.1 netCDF4: 1.5.7 pydap: None h5netcdf: None h5py: 3.3.0 Nio: None zarr: None cftime: 1.5.0 nc_time_axis: 1.3.1 PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2021.07.2 distributed: 2021.07.2 matplotlib: 3.4.2 cartopy: None seaborn: 0.11.1 numbagg: None pint: 0.17 setuptools: 49.6.0.post20210108 pip: 21.2.4 conda: 4.10.3 pytest: 6.2.4 IPython: 7.26.0 sphinx: 4.1.2 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5852/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
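A manual work-around in the meantime is to interpolate slice-by-slice along the unaffected dimension, so NaN contamination stays confined to the slices that actually contain NaN input (a sketch, assuming scipy is installed; the explicit `x` coordinate is added here for clarity):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.ones([3, 4]), dims=("x", "y"),
                  coords={"x": np.arange(3)})
da[0, 0] = np.nan
newx = np.linspace(0.0, 2.0, 5)

# Interpolate each y-slice separately; only slices with NaN input can
# produce NaN output, instead of NaN spreading across the whole array.
slices = [da.isel(y=j).interp(x=newx) for j in range(da.sizes["y"])]
interp_da = xr.concat(slices, dim="y").transpose("x", "y")

# y-slices >= 1 contain no NaN input, so their output is all finite.
assert not np.isnan(interp_da.isel(y=slice(1, None))).any()
```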
940702754 | MDU6SXNzdWU5NDA3MDI3NTQ= | 5589 | Call .compute() in all plot methods? | johnomotani 3958036 | open | 0 | 3 | 2021-07-09T12:03:30Z | 2021-07-09T15:57:27Z | CONTRIBUTOR | I noticed what I think might be a performance bug: should all plot methods call `.compute()` on their data first? I was making plots from a large dataset of a quantity that is the output of quite a bit of computation. A script which made an animation of the full time-series (a couple of thousand time points) actually ran significantly faster than a script that made pcolormesh plots of just 3 time points (~2hrs compared to ~5hrs). The difference I can think of is that the animation script evaluated the data once up front, while each separate plot call seems to have triggered its own computation. 2d plots might all be covered by adding a `.compute()` call in the shared plotting code. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5589/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | issue | ||||||||
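Until something like that lands, the manual version of the proposed behaviour is to compute once up front and plot slices of the in-memory result (a sketch, assuming dask is installed; the timings quoted are the report's, not reproduced here):

```python
import numpy as np
import xarray as xr

lazy = xr.DataArray(np.random.rand(100, 20), dims=("t", "x")).chunk({"t": 10})

# Evaluate the (possibly expensive) dask graph exactly once...
loaded = lazy.compute()

# ...so each plotted slice is a cheap in-memory view, instead of each
# pcolormesh/plot call triggering its own full computation.
frames = [loaded.isel(t=t) for t in (0, 50, 99)]
assert all(frame.shape == (20,) for frame in frames)
```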
609907735 | MDExOlB1bGxSZXF1ZXN0NDExNDIxODIz | 4017 | Combining attrs of member DataArrays of Datasets | johnomotani 3958036 | closed | 0 | 0 | 2020-04-30T12:23:10Z | 2021-05-05T16:37:25Z | 2021-05-05T16:37:25Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/4017 | While looking at #4009, I noticed that the So far this PR adds tests that reproduce the issue in #4009, and the issue described above. Fixing should be fairly simple: for #4009 pass
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4017/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
847489988 | MDExOlB1bGxSZXF1ZXN0NjA2NTAyMzA4 | 5101 | Surface plots | johnomotani 3958036 | closed | 0 | 8 | 2021-03-31T22:58:20Z | 2021-05-03T13:05:59Z | 2021-05-03T13:05:02Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/5101 |
I'm not sure if there's somewhere good to note the new
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5101/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
842583817 | MDU6SXNzdWU4NDI1ODM4MTc= | 5084 | plot_surface() wrapper | johnomotani 3958036 | closed | 0 | 2 | 2021-03-27T19:16:09Z | 2021-05-03T13:05:02Z | 2021-05-03T13:05:02Z | CONTRIBUTOR | Is there an xarray way to make a surface plot, like matplotlib's |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5084/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
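The answer turned out to be "not yet": the wrapper was added by #5101 above, as `DataArray.plot.surface()`. With a recent xarray it looks like this (a sketch; the `Agg` backend just keeps the example headless):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the example
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

da = xr.DataArray(np.random.rand(4, 5), dims=("x", "y"))

# plot.surface() wraps mpl_toolkits' Axes3D.plot_surface, so it needs
# (or creates) a 3d projection axis.
ax = plt.axes(projection="3d")
da.plot.surface(ax=ax)
```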
857378504 | MDExOlB1bGxSZXF1ZXN0NjE0ODA5NTg1 | 5153 | cumulative_integrate() method | johnomotani 3958036 | closed | 0 | 8 | 2021-04-13T22:53:58Z | 2021-05-02T10:34:23Z | 2021-05-01T20:01:31Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/5153 | Provides the functionality of
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5153/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
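The method this PR added landed as `cumulative_integrate()` on both DataArray and Dataset, doing trapezoidal cumulative integration along a coordinate. A quick check (trapezoidal integration is exact for a linear integrand):

```python
import numpy as np
import xarray as xr

x = np.linspace(0.0, 1.0, 5)
da = xr.DataArray(2 * x, dims="x", coords={"x": x})

# Cumulative trapezoidal integral of f(x) = 2x is x**2, with the first
# entry of the result defined as 0; output has the same length as input.
result = da.cumulative_integrate("x")
assert np.allclose(result, x ** 2)
```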
823059252 | MDU6SXNzdWU4MjMwNTkyNTI= | 5002 | Dataset.plot.quiver() docs slightly misleading | johnomotani 3958036 | closed | 0 | 3 | 2021-03-05T12:54:19Z | 2021-05-01T17:38:39Z | 2021-05-01T17:38:39Z | CONTRIBUTOR | In the docs for |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5002/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
847006334 | MDU6SXNzdWU4NDcwMDYzMzQ= | 5097 | 2d plots may fail for some choices of `x` and `y` | johnomotani 3958036 | closed | 0 | 1 | 2021-03-31T17:26:34Z | 2021-04-22T07:16:17Z | 2021-04-22T07:16:17Z | CONTRIBUTOR | What happened:
When making a 2d plot with a 1d coordinate passed for one of `x`/`y` and a 2d coordinate for the other, the plot can silently be wrong. What you expected to happen: All three plots in the MCVE should be identical.

Minimal Complete Verifiable Example:

```python
from matplotlib import pyplot as plt
import numpy as np
import xarray as xr

ds = xr.Dataset({"z": (["x", "y"], np.random.rand(4,4))})
x2d, y2d = np.meshgrid(ds["x"], ds["y"])
ds = ds.assign_coords(x2d=(["x", "y"], x2d.T), y2d=(["x", "y"], y2d.T))

fig, axes = plt.subplots(1,3)
h0 = ds["z"].plot.pcolormesh(x="y2d", y="x2d", ax=axes[0])
h1 = ds["z"].plot.pcolormesh(x="y", y="x", ax=axes[1])
h2 = ds["z"].plot.pcolormesh(x="y", y="x2d", ax=axes[2])
plt.show()
```

result:
Anything else we need to know?: The bug is present in both the 0.17.0 release and current `master`. I came across this while starting to work on #5084. I think the problem is here
https://github.com/pydata/xarray/blob/ddc352faa6de91f266a1749773d08ae8d6f09683/xarray/plot/plot.py#L678-L684
as the check there rejects valid mixed 1d/2d combinations of `x` and `y`. Why don't we just do something like broadcasting the 1d coordinate against the 2d one?
Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.6 | packaged by conda-forge | (default, Oct 7 2020, 19:08:05) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-70-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.17.0 pandas: 1.1.5 numpy: 1.19.4 scipy: 1.5.3 netCDF4: 1.5.5.1 pydap: None h5netcdf: None h5py: 3.1.0 Nio: None zarr: None cftime: 1.3.0 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2020.12.0 distributed: 2020.12.0 matplotlib: 3.3.3 cartopy: None seaborn: None numbagg: None pint: 0.16.1 setuptools: 49.6.0.post20201009 pip: 20.3.3 conda: 4.9.2 pytest: 6.2.1 IPython: 7.19.0 sphinx: 3.4.0 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5097/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
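The suggested fix (adopted in #5099 below) is essentially: when one of the two plot coordinates is 2d and the other 1d, broadcast the 1d one up rather than rejecting the combination. A sketch of that check in isolation (with explicit coordinates added so the example is self-contained):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"z": (["x", "y"], np.random.rand(4, 4))},
                coords={"x": np.arange(4), "y": np.arange(4)})
x2d, y2d = np.meshgrid(ds["x"], ds["y"])
ds = ds.assign_coords(x2d=(["x", "y"], x2d.T), y2d=(["x", "y"], y2d.T))

xval, yval = ds["y"], ds["x2d"]  # mixed 1d / 2d, as in the MCVE

# If only one coordinate is 2d, broadcast the 1d one against it so both
# arrays handed to pcolormesh have matching shape and dimension order.
if xval.ndim < yval.ndim:
    xval = xval.broadcast_like(yval)
elif yval.ndim < xval.ndim:
    yval = yval.broadcast_like(xval)

assert xval.dims == yval.dims == ("x", "y")
```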
847199398 | MDExOlB1bGxSZXF1ZXN0NjA2MjMwMjE1 | 5099 | Use broadcast_like for 2d plot coordinates | johnomotani 3958036 | closed | 0 | 3 | 2021-03-31T19:34:32Z | 2021-04-22T07:16:17Z | 2021-04-22T07:16:17Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/5099 | Use broadcast_like if either @dcherian
This change seems to 'just work', and unit tests pass. Is there some extra check that needs doing to make sure "resolving intervals" is behaving correctly? I can't think of a unit test that would have caught #5097, since even when the bug happens, a plot is produced without errors or warnings. If anyone has an idea, suggestions/pushes welcome!
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5099/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
823290488 | MDExOlB1bGxSZXF1ZXN0NTg1Nzc0MjY4 | 5003 | Add Dataset.plot.streamplot() method | johnomotani 3958036 | closed | 0 | 2 | 2021-03-05T17:41:49Z | 2021-03-30T16:41:08Z | 2021-03-30T16:41:07Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/5003 | Since @dcherian added Quiver plots in #4407, it's fairly simple to extend the functionality to `streamplot`. For example (copying from @dcherian's unit test setup)

```
import numpy as np
import xarray as xr
from matplotlib import pyplot as plt

das = [
    xr.DataArray(
        np.random.randn(3, 3),
        dims=["x", "y"],
        coords=[range(k) for k in [3, 3]],
    )
    for _ in [1, 2]
]
ds = xr.Dataset({"u": das[0], "v": das[1]})
ds["mag"] = np.hypot(ds.u, ds.v)
ds.plot.streamplot(x="x", y="y", u="u", v="v", hue="mag")
plt.show()
```
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5003/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
698263021 | MDU6SXNzdWU2OTgyNjMwMjE= | 4415 | Adding new DataArray to a Dataset removes attrs of existing coord | johnomotani 3958036 | closed | 0 | 3 | 2020-09-10T17:21:32Z | 2020-09-10T17:38:08Z | 2020-09-10T17:38:08Z | CONTRIBUTOR | Minimal Complete Verifiable Example:

```
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds["a"] = xr.DataArray(np.linspace(0., 1.), dims="x")
ds["x"] = xr.DataArray(np.linspace(0., 2., len(ds["x"])), dims="x")
ds["x"].attrs["foo"] = "bar"
print(ds["x"])

ds["b"] = xr.DataArray(np.linspace(0., 1.), dims="x")
print(ds["x"])
```

What happened:
Attribute full output</tt>``` <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 Attributes: foo: bar <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 ```What you expected to happen:
Coordinate full expected output</tt>``` <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 Attributes: foo: bar <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 
1.878 1.918 1.959 2.0 Attributes: foo: bar ```Anything else we need to know?: Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-47-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.1.1 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.23.0 distributed: 2.25.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.6.0.post20200814 pip: 20.2.3 conda: 4.8.4 pytest: 5.4.3 IPython: 7.15.0 sphinx: 3.2.1 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
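This was fixed the same day it was reported (see the closed date); with any recent xarray the coordinate's attrs now survive adding a new variable, which the MCVE above reduces to:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds["a"] = xr.DataArray(np.linspace(0.0, 1.0), dims="x")
ds["x"] = xr.DataArray(np.linspace(0.0, 2.0, ds.sizes["x"]), dims="x")
ds["x"].attrs["foo"] = "bar"

# Adding another variable on the same dimension must not touch the
# existing coordinate's attributes.
ds["b"] = xr.DataArray(np.linspace(0.0, 1.0), dims="x")
assert ds["x"].attrs == {"foo": "bar"}
```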
668515620 | MDU6SXNzdWU2Njg1MTU2MjA= | 4289 | title bar of docs displays incorrect version | johnomotani 3958036 | closed | 0 | 5 | 2020-07-30T09:00:43Z | 2020-08-18T22:32:51Z | 2020-08-18T22:32:51Z | CONTRIBUTOR | What happened:
The browser title bar displays an incorrect version when viewing the docs online. See below - title bar says 0.15.1 but actual version in URL is 0.16.0.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4289/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
671222189 | MDExOlB1bGxSZXF1ZXN0NDYxNDQzNzcx | 4298 | Fix docstring for missing_dims argument to isel methods | johnomotani 3958036 | closed | 0 | 1 | 2020-08-01T21:40:27Z | 2020-08-03T20:23:29Z | 2020-08-03T20:23:28Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/4298 | Incorrect value "exception" was given in the description of the `missing_dims` argument; the allowed value is "raise".
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4298/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
594594646 | MDExOlB1bGxSZXF1ZXN0Mzk5MjAwODg3 | 3936 | Support multiple dimensions in DataArray.argmin() and DataArray.argmax() methods | johnomotani 3958036 | closed | 0 | 27 | 2020-04-05T18:52:52Z | 2020-06-29T20:22:49Z | 2020-06-29T19:36:26Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/3936 | These return dicts of the indices of the minimum or maximum of a DataArray over several dimensions. Inspired by @fujiisoup's work in #1469. With #3871, replaces #1469. Provides a simpler solution to #3160. Implemented so that
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3936/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
644485140 | MDExOlB1bGxSZXF1ZXN0NDM5MDk4NTcz | 4173 | Fix 4009 | johnomotani 3958036 | closed | 0 | 1 | 2020-06-24T09:59:28Z | 2020-06-24T18:22:20Z | 2020-06-24T18:22:19Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/4173 | Don't know if/when I'll have time to finish #4017, so pulling out the fix for #4009 into a separate PR here that is ready to merge.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4173/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
595882590 | MDU6SXNzdWU1OTU4ODI1OTA= | 3948 | Releasing memory? | johnomotani 3958036 | closed | 0 | 6 | 2020-04-07T13:49:07Z | 2020-04-07T14:18:36Z | 2020-04-07T14:18:36Z | CONTRIBUTOR | Once data has been loaded, is there a way to release the memory again? For example, what would be the best workflow for this case: I have several large arrays on disk. Each will fit into memory individually. I want to do some analysis on each array (which produces small results), and keep the results in memory, but I do not need the large arrays any more after the analysis. I'm wondering if some sort of `release()` method would be useful:

```
da2 = ds["variable2"]
result2 = do_some_work(da2)  # may load large parts of da2 into memory
da2.release()  # any changes to da2 not already saved to disk are lost, but do not want da2 any more
# ... etc.
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3948/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
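For the file-backed case, the closest existing answer is scoping: open lazily, pull out the small results, and close the dataset so caches and file handles can be dropped. A sketch of that workflow (the file name and the reduction are illustrative; writing the file needs a netCDF backend such as netCDF4 or scipy):

```python
import numpy as np
import xarray as xr

# Stand-in for a large on-disk array.
xr.Dataset({"variable2": ("x", np.arange(1e3))}).to_netcdf("big.nc")

# Open lazily, reduce to a small in-memory result, then let the context
# manager close the file so the loaded data can be garbage-collected.
with xr.open_dataset("big.nc") as ds:
    result2 = float(ds["variable2"].mean())

assert result2 == 499.5
```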
585868107 | MDExOlB1bGxSZXF1ZXN0MzkyMTExMTI4 | 3877 | Control attrs of result in `merge()`, `concat()`, `combine_by_coords()` and `combine_nested()` | johnomotani 3958036 | closed | 0 | 7 | 2020-03-23T01:32:59Z | 2020-04-05T20:44:47Z | 2020-03-24T20:40:18Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/3877 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3877/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
583220835 | MDU6SXNzdWU1ODMyMjA4MzU= | 3866 | Allow `isel` to ignore missing dimensions? | johnomotani 3958036 | closed | 0 | 0 | 2020-03-17T18:41:13Z | 2020-04-03T19:47:08Z | 2020-04-03T19:47:08Z | CONTRIBUTOR | Sometimes it would be nice for `isel` to be able to skip dimensions that are not present:

```
ds.isel(t=0) # currently raises an exception
ds.isel(t=0, ignore_missing=True) # would be nice if this was allowed, just returning ds
```

For example, when writing a function that can be called on variables with different combinations of dimensions. I think it should be fairly easy to implement, just add the argument to the condition here https://github.com/pydata/xarray/blob/65a5bff79479c4b56d6f733236fe544b7f4120a8/xarray/core/variable.py#L1059-L1062 the only downside would be increased complexity of adding another argument to the API for an issue where a workaround is not hard (at least in the case I have at the moment), just a bit clumsy. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3866/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
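The feature landed (via #3923 below) as a `missing_dims` argument taking string values rather than a boolean flag. A quick demonstration:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.zeros((2, 3)), dims=("x", "y"))

# "raise" (the default) errors on unknown dimensions; "ignore" skips
# them, applying only the indexers whose dimensions exist.
out = da.isel(t=0, missing_dims="ignore")
assert out.shape == (2, 3)

trimmed = da.isel(x=0, t=0, missing_dims="ignore")
assert trimmed.shape == (3,)
```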
591471233 | MDExOlB1bGxSZXF1ZXN0Mzk2NjM5NjM2 | 3923 | Add missing_dims argument allowing isel() to ignore missing dimensions | johnomotani 3958036 | closed | 0 | 5 | 2020-03-31T22:19:54Z | 2020-04-03T19:47:08Z | 2020-04-03T19:47:08Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/3923 | Note: only added to
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3923/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
587280307 | MDExOlB1bGxSZXF1ZXN0MzkzMjU2NTgx | 3887 | Rename ordered_dict_intersection -> compat_dict_intersection | johnomotani 3958036 | closed | 0 | 4 | 2020-03-24T21:08:26Z | 2020-03-24T22:59:07Z | 2020-03-24T22:59:07Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/3887 |
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3887/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
583080947 | MDU6SXNzdWU1ODMwODA5NDc= | 3865 | `merge` drops attributes | johnomotani 3958036 | closed | 0 | 1 | 2020-03-17T15:06:18Z | 2020-03-24T20:40:18Z | 2020-03-24T20:40:18Z | CONTRIBUTOR |
MCVE Code Sample

```python
import xarray as xr

ds1 = xr.Dataset()
ds1.attrs['a'] = 42
ds2 = xr.Dataset()
ds2.attrs['a'] = 42
merged = xr.merge([ds1, ds2])
print(merged)
```
Expected Output
Problem Description: The current behaviour means I have to check and copy the attrs over manually after every merge. I'm happy to attempt a PR to fix this.
Proposal (following pattern of This proposal should also allow VersionsCurrent Output of `xr.show_versions()`INSTALLED VERSIONS ------------------ commit: None python: 3.6.9 (default, Nov 7 2019, 10:44:02) [GCC 8.3.0] python-bits: 64 OS: Linux OS-release: 5.3.0-40-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.2 libnetcdf: 4.6.3 xarray: 0.15.0 pandas: 1.0.2 numpy: 1.18.1 scipy: 1.3.0 netCDF4: 1.5.1.2 pydap: None h5netcdf: None h5py: 2.9.0 Nio: None zarr: None cftime: 1.0.3.4 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.12.0 distributed: None matplotlib: 3.1.1 cartopy: None seaborn: None numbagg: None setuptools: 45.2.0 pip: 9.0.1 conda: None pytest: 4.4.1 IPython: 7.8.0 sphinx: 1.8.3 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3865/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
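Following the linked fix (#3877 above), `merge` keeps attributes, and a `combine_attrs` argument controls how conflicts are handled:

```python
import xarray as xr

ds1 = xr.Dataset(attrs={"a": 42})
ds2 = xr.Dataset(attrs={"a": 42})

# "no_conflicts" keeps the attrs and raises if the inputs disagree on
# the value of a shared key.
merged = xr.merge([ds1, ds2], combine_attrs="no_conflicts")
assert merged.attrs == {"a": 42}
```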
533996523 | MDExOlB1bGxSZXF1ZXN0MzQ5OTc0NjUz | 3601 | Fix contourf set under | johnomotani 3958036 | closed | 0 | 5 | 2019-12-06T13:47:37Z | 2020-02-24T20:20:09Z | 2020-02-24T20:20:08Z | CONTRIBUTOR | 0 | pydata/xarray/pulls/3601 | Copies the I'm not a fan of copying attributes one-by-one like this, but I guess this is an issue with matplotlib's API, unless there's a nicer way to convert a
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3601/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
xarray 13221727 | pull | |||||
532165408 | MDU6SXNzdWU1MzIxNjU0MDg= | 3590 | cmap.set_under() does not work as expected | johnomotani 3958036 | closed | 0 | 5 | 2019-12-03T18:04:07Z | 2020-02-24T20:20:07Z | 2020-02-24T20:20:07Z | CONTRIBUTOR | When using matplotlib, the `cmap.set_under()` method can be used to set the colour of values below `vmin`:

```
import numpy
import matplotlib
from matplotlib import pyplot

dat = numpy.linspace(0, 1)[numpy.newaxis, :]*numpy.linspace(0, 1)[:, numpy.newaxis]

cmap = matplotlib.cm.viridis
# cmap.set_under('w')
pyplot.contourf(dat, vmin=.3, cmap=cmap)
pyplot.colorbar()
pyplot.show()
```

produces a plot using the default colormap, while uncommenting the `cmap.set_under('w')` line produces one with white below `vmin`. However, using xarray's plot method:

```
from xarray import DataArray

da = DataArray(numpy.linspace(0, 1)[numpy.newaxis, :]*numpy.linspace(0, 1)[:, numpy.newaxis])

cmap = matplotlib.cm.viridis
cmap.set_under('w')
da.plot.contourf(vmin=.3, cmap=cmap)
pyplot.show()
```

the `set_under` colour does not appear in the plot.

Output of <tt>xr.show_versions()</tt>
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3590/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
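Independently of the xarray-side fix (#3601 above), two matplotlib-side details make `set_under` behave predictably: copy the shared colormap instead of mutating `matplotlib.cm.viridis` in place, and pass explicit `levels` with `extend="min"` so below-range values are actually mapped to the under colour. A sketch (assuming matplotlib 3.6+ for the `colormaps` registry):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for the example
from matplotlib import pyplot

dat = np.linspace(0, 1)[np.newaxis, :] * np.linspace(0, 1)[:, np.newaxis]

# Copy so the globally registered "viridis" is not mutated for everyone.
cmap = matplotlib.colormaps["viridis"].copy()
cmap.set_under("w")

# With explicit levels and extend="min", values below 0.3 use the under
# colour, and the colorbar grows a matching arrow.
cs = pyplot.contourf(dat, levels=np.linspace(0.3, 1.0, 8), cmap=cmap, extend="min")
pyplot.colorbar(cs)
```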
507468507 | MDU6SXNzdWU1MDc0Njg1MDc= | 3401 | _get_scheduler() exception if dask.multiprocessing missing | johnomotani 3958036 | closed | 0 | 0 | 2019-10-15T20:35:14Z | 2019-10-21T00:17:48Z | 2019-10-21T00:17:48Z | CONTRIBUTOR | These lines were recently changed in #3358 https://github.com/pydata/xarray/blob/3f9069ba376afa35c0ca83b09a6126dd24cb8127/xarray/backends/locks.py#L87-L92 If the 'cloudpickle' package is not installed, then `dask.multiprocessing` does not exist and `_get_scheduler()` raises an AttributeError (see the traceback below). Suggest either reverting the changes that removed the try/except, or guarding the attribute access. To reproduce: 1. check 'cloudpickle' is not installed, but 'dask' is 2. execute the following commands

```
>>> import xarray
>>> xarray.backends.api._get_scheduler()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-2-20da238796b7> in <module>
----> 1 xarray.backends.api._get_scheduler()

~/.local/lib/python3.6/site-packages/xarray/backends/locks.py in _get_scheduler(get, collection)
     87             pass
     88
---> 89     if actual_get is dask.multiprocessing.get:
     90         return "multiprocessing"
     91     else:

AttributeError: module 'dask' has no attribute 'multiprocessing'
```
 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3401/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |
```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```