
issue_comments


48 rows where user = 30388627 sorted by updated_at descending


issue 24

  • Interpolate 3D array by another 3D array 5
  • Sort DataArray by data values along one dim 5
  • Using min() with skipna=True 4
  • Coordinates passed to interp have nan values 4
  • Spurious lines of the pcolormesh example 4
  • Sum based on start_index and end_index array 3
  • Concatenate DataArrays on one dim when another dim has difference sizes 3
  • open_mfdataset change the attributes of Coordinates 2
  • Change the label size and tick label size of colorbar 2
  • Support `range` in `groupby_bins` 2
  • add scatter plot method to dataset 1
  • Exact alignment should allow missing dimension coordinates 1
  • Save 'S1' array without the char_dim_name dimension 1
  • Index 3D array with index of last axis stored in 2D array 1
  • Concatenate 3D array with 2D array 1
  • Masking and preserving int type 1
  • interpolate_na doesn't support extrapolation 1
  • Reimplement GroupBy.argmax 1
  • Set `allow_rechunk=True` still raise different lengths error 1
  • Missing linked coordinates of subgroup variable 1
  • uint type data are read as wrong type (float64) 1
  • Issue on page /examples/multidimensional-coords.html 1
  • extrapolate not working for multi-dimentional data 1
  • Support `skipna` in `.where()` 1

user 1

  • zxdawn · 48

author_association 1

  • NONE 48
id html_url issue_url node_id user created_at updated_at author_association body reactions performed_via_github_app issue
1163517013 https://github.com/pydata/xarray/issues/6713#issuecomment-1163517013 https://api.github.com/repos/pydata/xarray/issues/6713 IC_kwDOAMm_X85FWdxV zxdawn 30388627 2022-06-22T19:25:50Z 2022-06-22T19:25:50Z NONE

Thanks for the tip! It works well.

Is it better to raise a warning or something else to remind users that there are NaN values in the mask?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Support `skipna` in `.where()` 1279891109
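The comment above asks about warning users when the mask passed to `.where()` contains NaN. A minimal sketch of such a check, written here as an illustration only (the data and mask are made up; this is not part of xarray's API):

```
import warnings

import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(4.0), dims="x")
mask = xr.DataArray([1.0, 0.0, np.nan, 1.0], dims="x")  # hypothetical mask containing NaN

# NaN is not a clean True/False, so flag it before using the mask as a condition
if mask.isnull().any():
    warnings.warn("mask contains NaN values; decide explicitly how they should be treated")

# One explicit choice: treat NaN as False
result = da.where(mask.fillna(0).astype(bool))
print(result)
```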
1030874975 https://github.com/pydata/xarray/issues/470#issuecomment-1030874975 https://api.github.com/repos/pydata/xarray/issues/470 IC_kwDOAMm_X849cedf zxdawn 30388627 2022-02-06T17:15:46Z 2022-02-06T17:15:46Z NONE

@aidanheerdegen Thanks for the code. I suppose it's better to mention this method for DataArray in the User Guide. @dcherian Should I create a PR for an example like this?

```
air = xr.tutorial.open_dataset("air_temperature")['air']

air.isel(lon=10, lat=[19, 21, 22]).plot.line(x="time", marker='o', linewidth=0., markersize=1)
```

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  add scatter plot method to dataset 94787306
1020614105 https://github.com/pydata/xarray/issues/6188#issuecomment-1020614105 https://api.github.com/repos/pydata/xarray/issues/6188 IC_kwDOAMm_X8481VXZ zxdawn 30388627 2022-01-24T22:27:48Z 2022-01-24T22:27:48Z NONE

@andersy005 Thanks a lot, I realized that it's mentioned in the comments of the Guide.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  extrapolate not working for multi-dimentional data 1112925311
1001690367 https://github.com/pydata/xarray/issues/6085#issuecomment-1001690367 https://api.github.com/repos/pydata/xarray/issues/6085 IC_kwDOAMm_X847tJT_ zxdawn 30388627 2021-12-27T18:25:45Z 2021-12-27T18:25:45Z NONE

Hi @TomNicholas, thanks and yes that's the same issue. Shall we close this duplicated one?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Missing linked coordinates of subgroup variable 1083806365
998948479 https://github.com/pydata/xarray/issues/6095#issuecomment-998948479 https://api.github.com/repos/pydata/xarray/issues/6095 IC_kwDOAMm_X847ir5_ zxdawn 30388627 2021-12-21T17:06:32Z 2021-12-21T17:06:32Z NONE

Old figure:

New figure:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Issue on page /examples/multidimensional-coords.html 1086038682
998910844 https://github.com/pydata/xarray/issues/6091#issuecomment-998910844 https://api.github.com/repos/pydata/xarray/issues/6091 IC_kwDOAMm_X847iit8 zxdawn 30388627 2021-12-21T16:15:14Z 2021-12-21T16:15:14Z NONE

Ha, thanks. It makes sense now. Shall we close this?

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  uint type data are read as wrong type (float64) 1085619598
953852124 https://github.com/pydata/xarray/issues/5901#issuecomment-953852124 https://api.github.com/repos/pydata/xarray/issues/5901 IC_kwDOAMm_X8442qDc zxdawn 30388627 2021-10-28T13:35:59Z 2021-10-28T13:51:24Z NONE

@jklymak Thanks for the explanation.

To get the old behaviour you simply need to do pcolormesh(x, y, Z[:-1, :-1], shading='flat') or make x and y one larger than Z in each dimension and specify the corners of the quadrilaterals.

Method 1: Subset value

This method works for the xarray tutorial data, but not for the TROPOMI polar-orbiting satellite data.

```
%matplotlib inline

import xarray as xr
import cartopy.crs as ccrs
import matplotlib.pyplot as plt

plt.figure(figsize=(14,6))
ax = plt.axes(projection=ccrs.PlateCarree())

ds = xr.open_dataset('./S5P_OFFL_L2__NO2____20190810T212136_20190810T230306_09456_01_010302_20190816T233944.nc',
                     group='PRODUCT').isel(time=0)

m = ax.pcolormesh(ds['longitude'], ds['latitude'],
                  ds['nitrogendioxide_tropospheric_column'][:-1, :-1],
                  # ds['nitrogendioxide_tropospheric_column'],
                  # shading='auto',
                  transform=ccrs.PlateCarree(),
                  vmin=0, vmax=1e-4, cmap='Spectral_r')
```

(The TROPOMI example data is uploaded to Google Drive)

Method 2: bounds

This issue still exists with bounds data:

```
%matplotlib inline
import numpy as np
import xarray as xr
import cartopy.crs as ccrs
import matplotlib.pyplot as plt

def prepare_geo(bounds_data):
    """Prepare lat/lon bounds for pcolormesh.

    lat/lon bounds are ordered in the following way::

        3----2
        |    |
        0----1

    Extend longitudes and latitudes with one element to support "pcolormesh"::

        (X[i+1, j], Y[i+1, j])      (X[i+1, j+1], Y[i+1, j+1])
                              +--------+
                              | C[i,j] |
                              +--------+
            (X[i, j], Y[i, j])        (X[i, j+1], Y[i, j+1])
    """
    # Create the left array
    left = np.vstack([bounds_data[:, :, 0], bounds_data[-1:, :, 3]])
    # Create the right array
    right = np.vstack([bounds_data[:, -1:, 1], bounds_data[-1:, -1:, 2]])
    # Stack horizontally
    dest = np.hstack([left, right])
    # Convert to DataArray
    dest = xr.DataArray(dest,
                        dims=('y_bounds', 'x_bounds'),
                        attrs=bounds_data.attrs
                        )
    return dest

ds = xr.open_dataset('./S5P_OFFL_L2__NO2____20190810T212136_20190810T230306_09456_01_010302_20190816T233944.nc',
                     group='PRODUCT').isel(time=0)
ds_geo = xr.open_dataset('./S5P_OFFL_L2__NO2____20190810T212136_20190810T230306_09456_01_010302_20190816T233944.nc',
                         group='/PRODUCT/SUPPORT_DATA/GEOLOCATIONS').isel(time=0)

lon_bounds = prepare_geo(ds_geo['longitude_bounds'])
lat_bounds = prepare_geo(ds_geo['latitude_bounds'])

plt.figure(figsize=(14,6))
ax = plt.axes(projection=ccrs.PlateCarree())

m = ax.pcolormesh(lon_bounds, lat_bounds,
                  ds['nitrogendioxide_tropospheric_column'],
                  # shading='auto',
                  transform=ccrs.PlateCarree(),
                  vmin=0, vmax=1e-4, cmap='Spectral_r')
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Spurious lines of the pcolormesh example 1037814301
953350140 https://github.com/pydata/xarray/issues/5901#issuecomment-953350140 https://api.github.com/repos/pydata/xarray/issues/5901 IC_kwDOAMm_X8440vf8 zxdawn 30388627 2021-10-27T22:13:44Z 2021-10-27T22:13:44Z NONE

@QuLogic Ha, it looks good with the latest cartopy (0.20.1). Thanks a lot.

@TomNicholas So, is it better to keep this open until the doc is updated?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Spurious lines of the pcolormesh example 1037814301
953298464 https://github.com/pydata/xarray/issues/5901#issuecomment-953298464 https://api.github.com/repos/pydata/xarray/issues/5901 IC_kwDOAMm_X8440i4g zxdawn 30388627 2021-10-27T20:47:18Z 2021-10-27T21:00:46Z NONE

@TomNicholas I checked the docs and this issue starts from v0.16.1. Note that there are also small spurious lines after v0.10.9; before v0.10.9, the figure looks fine. It may also be related to matplotlib ... CC @jklymak and @timhoffm.

{
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
  Spurious lines of the pcolormesh example 1037814301
953295655 https://github.com/pydata/xarray/issues/5901#issuecomment-953295655 https://api.github.com/repos/pydata/xarray/issues/5901 IC_kwDOAMm_X8440iMn zxdawn 30388627 2021-10-27T20:42:58Z 2021-10-27T20:42:58Z NONE

BTW, the question on StackOverflow, which was raised by @gerritholl a long time ago, looks similar. I'm not sure whether this is a cartopy issue, CC @QuLogic and @greglucas.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Spurious lines of the pcolormesh example 1037814301
855206219 https://github.com/pydata/xarray/issues/5439#issuecomment-855206219 https://api.github.com/repos/pydata/xarray/issues/5439 MDEyOklzc3VlQ29tbWVudDg1NTIwNjIxOQ== zxdawn 30388627 2021-06-05T08:34:52Z 2021-06-05T08:37:11Z NONE

Sorry for this issue. It was actually caused by missing args like input_core_dims, exclude_dims, etc.

Anyway, this one works well:

```
res = xr.apply_ufunc(reduceat_np, dask_data, bins_reduceat[:5],
                     input_core_dims=[['x'], ['new_x']],
                     exclude_dims=set(('x',)),
                     output_core_dims=[['new_x']],
                     dask="parallelized",
                     output_dtypes=[data.dtype],
                     dask_gufunc_kwargs={'allow_rechunk': True},
                     )
res.compute()
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Set `allow_rechunk=True` still raise different lengths error 912149228
851300846 https://github.com/pydata/xarray/issues/5358#issuecomment-851300846 https://api.github.com/repos/pydata/xarray/issues/5358 MDEyOklzc3VlQ29tbWVudDg1MTMwMDg0Ng== zxdawn 30388627 2021-05-31T08:12:22Z 2021-05-31T08:12:22Z NONE

@dcherian Has this method been improved in dask_groupby? Could you provide a simple example we can follow? I got lost in the dask_groupby documentation ...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Support `range` in `groupby_bins` 897689314
845924589 https://github.com/pydata/xarray/issues/5358#issuecomment-845924589 https://api.github.com/repos/pydata/xarray/issues/5358 MDEyOklzc3VlQ29tbWVudDg0NTkyNDU4OQ== zxdawn 30388627 2021-05-21T12:44:39Z 2021-05-21T12:44:39Z NONE

@dcherian Thanks! That's simple ;) However, the groupby_bins method is a little different from binned_statistic.

binned_statistic:

All but the last (righthand-most) bin is half-open. In other words, if bins is [1, 2, 3, 4], then the first bin is [1, 2) (including 1, but excluding 2) and the second [2, 3). The last bin, however, is [3, 4], which includes 4.

groupby_bins:

right (bool, default: True) – Indicates whether the bins include the rightmost edge or not. If right == True (the default), then the bins [1,2,3,4] indicate (1,2], (2,3], (3,4].

So, let's check this shorter example:

```
from scipy.stats import binned_statistic
import numpy as np
import xarray as xr

# --- scipy method ---
x = np.arange(10)
values = x*5
statistics, _, _ = binned_statistic(x, values, statistic='min', bins=10, range=(0, 10))

# --- xarray method ---
x = xr.DataArray(x)
values = xr.DataArray(values)
bin_res = values.groupby_bins('dim_0', bins=np.linspace(0, 10, 10),
                              right=False, include_lowest=True).min()

print('scipy: \n', statistics)
print('xarray: \n', bin_res)
```

Output:

```
scipy:
 [ 0.  5. 10. 15. 20. 25. 30. 35. 40. 45.]
xarray:
 <xarray.DataArray (dim_0_bins: 9)>
array([ 0, 10, 15, 20, 25, 30, 35, 40, 45])
Coordinates:
  * dim_0_bins  (dim_0_bins) object [0.0, 1.111) ... [8.889, 10.0)
```

The scipy method has one more value ...

Summary

These produce the same results:

```
binned_statistic(x, values, statistic='min', bins=10, range=(0, 10))
values.groupby_bins('dim_0', bins=np.linspace(0, 10, 11), right=False, include_lowest=True).min()
```

Output:

```
scipy:
 [ 0.  5. 10. 15. 20. 25. 30. 35. 40. 45.]
xarray:
 <xarray.DataArray (dim_0_bins: 10)>
array([ 0,  5, 10, 15, 20, 25, 30, 35, 40, 45])
Coordinates:
  * dim_0_bins  (dim_0_bins) object [0.0, 1.0) [1.0, 2.0) ... [9.0, 10.0)
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Support `range` in `groupby_bins` 897689314
788732242 https://github.com/pydata/xarray/issues/4476#issuecomment-788732242 https://api.github.com/repos/pydata/xarray/issues/4476 MDEyOklzc3VlQ29tbWVudDc4ODczMjI0Mg== zxdawn 30388627 2021-03-02T08:42:51Z 2021-03-02T08:42:51Z NONE

@markusritschel I tested v0.17.0 and it doesn't work.

```
data = np.random.rand(4, 3)
locs = ["IA", "IL", "IN"]
times = pd.date_range("2000-01-01", periods=4)
foo = xr.DataArray(data, coords=[times, locs], dims=["time", "space"])
foo.groupby('space').argmax('time')
```

Error: AttributeError: 'DataArrayGroupBy' object has no attribute 'argmax'

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Reimplement GroupBy.argmax 712217045
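Since `DataArrayGroupBy.argmax` had not been reimplemented at the time of the comment above, one possible workaround is to apply `argmax` group by group via `GroupBy.map`. This is an editor sketch under the assumption of an xarray version where grouped objects expose `.map`; it is not the fix tracked in the issue:

```
import numpy as np
import pandas as pd
import xarray as xr

data = np.random.rand(4, 3)
locs = ["IA", "IL", "IN"]
times = pd.date_range("2000-01-01", periods=4)
foo = xr.DataArray(data, coords=[times, locs], dims=["time", "space"])

# Apply argmax to each 'space' group separately; map stitches the results back together
idx = foo.groupby("space").map(lambda da: da.argmax("time"))
print(idx)
```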
707744782 https://github.com/pydata/xarray/issues/3957#issuecomment-707744782 https://api.github.com/repos/pydata/xarray/issues/3957 MDEyOklzc3VlQ29tbWVudDcwNzc0NDc4Mg== zxdawn 30388627 2020-10-13T13:38:34Z 2020-10-13T13:38:34Z NONE

@JavierRuano I found a simpler solution in a similar question on Stack Overflow.

sort_pair = np.take_along_axis(pair.values, cld.argsort(axis=0), axis=0)

Complete example

```
import xarray as xr
import numpy as np

x = 4
y = 2
z = 4

data = np.arange(x*y*z).reshape(z, y, x)

# 3d array with coords
cld_1 = xr.DataArray(data, dims=['z', 'y', 'x'], coords={'z': np.arange(z)})

# 2d array without coords
cld_2 = xr.DataArray(np.arange(x*y).reshape(y, x)*1.5+1, dims=['y', 'x'])

# expand 2d to 3d
cld_2 = cld_2.expand_dims(z=[4])

# concat
cld = xr.concat([cld_1, cld_2], dim='z')

# paired array
pair = cld.copy(data=np.arange(x*y*(z+1)).reshape(z+1, y, x))

sort_pair = np.take_along_axis(pair.values, cld.argsort(axis=0), axis=0)

print(cld)
print(pair)
print(sort_pair)
```

Output:

```
<xarray.DataArray (z: 5, y: 2, x: 4)>
array([[[ 0. ,  1. ,  2. ,  3. ],
        [ 4. ,  5. ,  6. ,  7. ]],

       [[ 8. ,  9. , 10. , 11. ],
        [12. , 13. , 14. , 15. ]],

       [[16. , 17. , 18. , 19. ],
        [20. , 21. , 22. , 23. ]],

       [[24. , 25. , 26. , 27. ],
        [28. , 29. , 30. , 31. ]],

       [[ 1. ,  2.5,  4. ,  5.5],
        [ 7. ,  8.5, 10. , 11.5]]])
Coordinates:
  * z        (z) int64 0 1 2 3 4
Dimensions without coordinates: y, x

<xarray.DataArray (z: 5, y: 2, x: 4)>
array([[[ 0,  1,  2,  3],
        [ 4,  5,  6,  7]],

       [[ 8,  9, 10, 11],
        [12, 13, 14, 15]],

       [[16, 17, 18, 19],
        [20, 21, 22, 23]],

       [[24, 25, 26, 27],
        [28, 29, 30, 31]],

       [[32, 33, 34, 35],
        [36, 37, 38, 39]]])
Coordinates:
  * z        (z) int64 0 1 2 3 4
Dimensions without coordinates: y, x

[[[ 0  1  2  3]
  [ 4  5  6  7]]

 [[32 33 34 35]
  [36 37 38 39]]

 [[ 8  9 10 11]
  [12 13 14 15]]

 [[16 17 18 19]
  [20 21 22 23]]

 [[24 25 26 27]
```

Note that I have to use pair.values instead of pair in the last sorting step. Otherwise, I get this error:

IndexError: Unlabeled multi-dimensional array cannot be used for indexing: y

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sort DataArray by data values along one dim 596606599
688560368 https://github.com/pydata/xarray/issues/4410#issuecomment-688560368 https://api.github.com/repos/pydata/xarray/issues/4410 MDEyOklzc3VlQ29tbWVudDY4ODU2MDM2OA== zxdawn 30388627 2020-09-08T00:58:44Z 2020-09-08T00:58:44Z NONE

Thanks, it works well.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  interpolate_na doesn't support extrapolation 694874737
621793960 https://github.com/pydata/xarray/issues/4016#issuecomment-621793960 https://api.github.com/repos/pydata/xarray/issues/4016 MDEyOklzc3VlQ29tbWVudDYyMTc5Mzk2MA== zxdawn 30388627 2020-04-30T12:12:11Z 2020-04-30T12:12:11Z NONE

@keewis Thanks! I will try to apply this method and check the results.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Concatenate DataArrays on one dim when another dim has difference sizes 609108666
621680403 https://github.com/pydata/xarray/issues/4016#issuecomment-621680403 https://api.github.com/repos/pydata/xarray/issues/4016 MDEyOklzc3VlQ29tbWVudDYyMTY4MDQwMw== zxdawn 30388627 2020-04-30T08:03:44Z 2020-04-30T08:03:44Z NONE

@dcherian Sorry. I made a mistake in the expected result. It should be [[0], [1, 2, 3]]

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Concatenate DataArrays on one dim when another dim has difference sizes 609108666
621300738 https://github.com/pydata/xarray/issues/4016#issuecomment-621300738 https://api.github.com/repos/pydata/xarray/issues/4016 MDEyOklzc3VlQ29tbWVudDYyMTMwMDczOA== zxdawn 30388627 2020-04-29T15:51:52Z 2020-04-29T15:51:52Z NONE

@JavierRuano The time indexes are the same in my real case. Maybe I have to merge these data if I can't find a solution.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Concatenate DataArrays on one dim when another dim has difference sizes 609108666
612561659 https://github.com/pydata/xarray/pull/3924#issuecomment-612561659 https://api.github.com/repos/pydata/xarray/issues/3924 MDEyOklzc3VlQ29tbWVudDYxMjU2MTY1OQ== zxdawn 30388627 2020-04-12T04:14:45Z 2020-04-12T04:14:45Z NONE

@dcherian Oh, thanks! After scipy is installed, it works:

```
=============================================== test session starts ===============================================
platform linux -- Python 3.7.6, pytest-5.4.1, py-1.8.1, pluggy-0.12.0 -- /yin_raid/xin/miniconda3/envs/xarray_dev/bin/python
cachedir: .pytest_cache
rootdir: /yin_raid/xin/github/xarray, inifile: setup.cfg
collected 2 items

test_interp.py::test_nans[True] SKIPPED                                                                      [ 50%]
test_interp.py::test_nans[False] PASSED                                                                      [100%]

================================================ warnings summary =================================================
xarray/tests/test_interp.py::test_nans[False]
xarray/tests/test_interp.py::test_nans[False]
  /yin_raid/xin/miniconda3/envs/xarray_dev/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
    return f(*args, **kwds)

-- Docs: https://docs.pytest.org/en/latest/warnings.html
==================================== 1 passed, 1 skipped, 2 warnings in 0.88s =====================================
```

I will update the test and pull it soon.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Coordinates passed to interp have nan values 591643901
612548656 https://github.com/pydata/xarray/pull/3924#issuecomment-612548656 https://api.github.com/repos/pydata/xarray/issues/3924 MDEyOklzc3VlQ29tbWVudDYxMjU0ODY1Ng== zxdawn 30388627 2020-04-12T01:38:32Z 2020-04-12T01:38:32Z NONE

It's my first time writing a test for xarray. I tried pytest test_interp.py::test_nans -v, but all tests are skipped:

```
=================================================== test session starts ====================================================
platform linux -- Python 3.7.6, pytest-5.4.1, py-1.8.1, pluggy-0.12.0 -- /yin_raid/xin/miniconda3/envs/xarray_dev/bin/python
cachedir: .pytest_cache
rootdir: /yin_raid/xin/github/xarray, inifile: setup.cfg
collected 2 items

test_interp.py::test_nans[True] SKIPPED                                                                              [ 50%]
test_interp.py::test_nans[False] SKIPPED                                                                             [100%]

==================================================== 2 skipped in 0.50s ====================================================
```

How do I make the test actually run?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Coordinates passed to interp have nan values 591643901
611543489 https://github.com/pydata/xarray/issues/2283#issuecomment-611543489 https://api.github.com/repos/pydata/xarray/issues/2283 MDEyOklzc3VlQ29tbWVudDYxMTU0MzQ4OQ== zxdawn 30388627 2020-04-09T13:59:34Z 2020-04-09T13:59:34Z NONE

Any update? This issue could result in errors for many functions of xarray, like interp and dot.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Exact alignment should allow missing dimension coordinates 340733448
611483929 https://github.com/pydata/xarray/issues/3957#issuecomment-611483929 https://api.github.com/repos/pydata/xarray/issues/3957 MDEyOklzc3VlQ29tbWVudDYxMTQ4MzkyOQ== zxdawn 30388627 2020-04-09T11:43:51Z 2020-04-09T11:43:51Z NONE

I need to use df.index = pd.MultiIndex.from_arrays(.....). See https://github.com/pandas-dev/pandas/issues/33420

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sort DataArray by data values along one dim 596606599
611348892 https://github.com/pydata/xarray/issues/3957#issuecomment-611348892 https://api.github.com/repos/pydata/xarray/issues/3957 MDEyOklzc3VlQ29tbWVudDYxMTM0ODg5Mg== zxdawn 30388627 2020-04-09T06:13:07Z 2020-04-09T06:13:07Z NONE

@JavierRuano When the dataframe is converted back to a dataset, the values aren't changed because of the unchanged MultiIndex in the dataframe ... I have tried this:

```
df = ds.to_dataframe()
new_df = df.sort_values(by=['x', 'y', 'cld'])
new_df.index.set_levels(list(np.arange(ds['cld'].sizes['z'])), level='z', inplace=True)
```

But it doesn't work. Still trying ...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sort DataArray by data values along one dim 596606599
611299453 https://github.com/pydata/xarray/issues/3957#issuecomment-611299453 https://api.github.com/repos/pydata/xarray/issues/3957 MDEyOklzc3VlQ29tbWVudDYxMTI5OTQ1Mw== zxdawn 30388627 2020-04-09T02:54:30Z 2020-04-09T02:54:30Z NONE

@JavierRuano Nice suggestion! I combine them into a dataset, convert it to a dataframe and then sort_values. Finally, convert the dataframe back to a dataset:

```
ds = cld.to_dataset(name='cld')
ds['pair'] = pair

df = ds.to_dataframe()
new_ds = df.sort_values(by='cld').to_xarray().transpose()
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sort DataArray by data values along one dim 596606599
611291129 https://github.com/pydata/xarray/issues/3957#issuecomment-611291129 https://api.github.com/repos/pydata/xarray/issues/3957 MDEyOklzc3VlQ29tbWVudDYxMTI5MTEyOQ== zxdawn 30388627 2020-04-09T02:22:33Z 2020-04-09T02:22:33Z NONE

@JavierRuano Thank you very much. This example is a special case. If the order of z is different for each x and y, do we need to create a temporary DataArray to save the result of looping over x and y?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sort DataArray by data values along one dim 596606599
610808232 https://github.com/pydata/xarray/issues/3955#issuecomment-610808232 https://api.github.com/repos/pydata/xarray/issues/3955 MDEyOklzc3VlQ29tbWVudDYxMDgwODIzMg== zxdawn 30388627 2020-04-08T07:51:41Z 2020-04-08T07:51:41Z NONE

@kmuehlbauer Thanks, Nice trick! It works well for this situation.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Masking and preserving int type 596352097
610707081 https://github.com/pydata/xarray/issues/3954#issuecomment-610707081 https://api.github.com/repos/pydata/xarray/issues/3954 MDEyOklzc3VlQ29tbWVudDYxMDcwNzA4MQ== zxdawn 30388627 2020-04-08T01:52:30Z 2020-04-08T01:52:30Z NONE

Thanks, @fujiisoup . @dcherian decided to improve the error message later. So, I will leave this open.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Concatenate 3D array with 2D array 596249070
610420611 https://github.com/pydata/xarray/issues/3949#issuecomment-610420611 https://api.github.com/repos/pydata/xarray/issues/3949 MDEyOklzc3VlQ29tbWVudDYxMDQyMDYxMQ== zxdawn 30388627 2020-04-07T14:30:12Z 2020-04-07T14:30:12Z NONE

@johnomotani Thanks, it works. Sorry for this simple question ...

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Index 3D array with index of last axis stored in 2D array 595900209
610142267 https://github.com/pydata/xarray/issues/3941#issuecomment-610142267 https://api.github.com/repos/pydata/xarray/issues/3941 MDEyOklzc3VlQ29tbWVudDYxMDE0MjI2Nw== zxdawn 30388627 2020-04-07T02:45:26Z 2020-04-07T02:45:26Z NONE

@dcherian Sorry for the misunderstanding. I tried again for the 3d array, it works well ;)

```
import xarray as xr
import numpy as np

x = 2
y = 4
z = 3
data = np.arange(x*y*z).reshape(z, x, y)

# input array
a = xr.DataArray(data, dims=['z', 'y', 'x'])

# start_index array
sindex = xr.DataArray(np.full_like(a[0, ...], 0), dims=['y', 'x'])

# end_index array
eindex = xr.DataArray(np.full_like(a[0, ...], 1), dims=['y', 'x'])

zindex = a.z.copy(data=np.arange(a.sizes["z"]))

sub_z = (zindex >= sindex) & (zindex <= eindex)
sum_a = a.where(sub_z).sum('z', keepdims=True)

print(a)
print(sum_a)
```

```
<xarray.DataArray (z: 3, y: 2, x: 4)>
array([[[ 0,  1,  2,  3],
        [ 4,  5,  6,  7]],

       [[ 8,  9, 10, 11],
        [12, 13, 14, 15]],

       [[16, 17, 18, 19],
        [20, 21, 22, 23]]])
Dimensions without coordinates: z, y, x

<xarray.DataArray (z: 1, y: 2, x: 4)>
array([[[ 8., 10., 12., 14.],
        [16., 18., 20., 22.]]])
Dimensions without coordinates: z, y, x
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sum based on start_index and end_index array 594900245
609850841 https://github.com/pydata/xarray/issues/3941#issuecomment-609850841 https://api.github.com/repos/pydata/xarray/issues/3941 MDEyOklzc3VlQ29tbWVudDYwOTg1MDg0MQ== zxdawn 30388627 2020-04-06T15:04:23Z 2020-04-06T15:04:23Z NONE

@dcherian Excellent solution!

If we upgrade this to a 3D array and sum along the z axis, it seems that method isn't suitable:

```
import xarray as xr
import numpy as np

x = 2
y = 2
z = 3
data = np.arange(x*y*z).reshape(z, y, x)

# input array
a = xr.DataArray(data, dims=['z', 'y', 'x'])

# start_index array
sindex = xr.DataArray(np.full_like(a[0, ...], 0), dims=['y', 'x'])

# end_index array
eindex = xr.DataArray(np.full_like(a[0, ...], 1), dims=['y', 'x'])
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sum based on start_index and end_index array 594900245
609728948 https://github.com/pydata/xarray/issues/3941#issuecomment-609728948 https://api.github.com/repos/pydata/xarray/issues/3941 MDEyOklzc3VlQ29tbWVudDYwOTcyODk0OA== zxdawn 30388627 2020-04-06T11:09:10Z 2020-04-06T11:09:10Z NONE

Solution (Boolean)

```
# stack indexes
index_list = np.column_stack((sindex, eindex))

# all false array
boolean_array = np.zeros(a.shape, dtype=bool)

# iterate and assign true
for row in range(len(index_list)):
    boolean_array[row, np.arange(index_list[row][0], index_list[row][1]+1)] = True

sum_a = a.where(boolean_array).sum(dim='y')
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Sum based on start_index and end_index array 594900245
609528106 https://github.com/pydata/xarray/pull/3924#issuecomment-609528106 https://api.github.com/repos/pydata/xarray/issues/3924 MDEyOklzc3VlQ29tbWVudDYwOTUyODEwNg== zxdawn 30388627 2020-04-06T01:57:54Z 2020-04-06T02:03:41Z NONE

@spencerkclark Maybe convert the datetime into a number? BTW, why not require numpy >= 1.18.1?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Coordinates passed to interp have nan values 591643901
609040104 https://github.com/pydata/xarray/issues/3931#issuecomment-609040104 https://api.github.com/repos/pydata/xarray/issues/3931 MDEyOklzc3VlQ29tbWVudDYwOTA0MDEwNA== zxdawn 30388627 2020-04-04T14:51:32Z 2020-04-04T14:51:32Z NONE

@mathause Thanks! Shall we close this issue?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Interpolate 3D array by another 3D array 593770078
609038408 https://github.com/pydata/xarray/issues/3931#issuecomment-609038408 https://api.github.com/repos/pydata/xarray/issues/3931 MDEyOklzc3VlQ29tbWVudDYwOTAzODQwOA== zxdawn 30388627 2020-04-04T14:39:27Z 2020-04-04T14:39:27Z NONE

@mathause For .values, if I delete vectorize=True, I get this error:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/scipy/interpolate/interpolate.py", line 455, in __init__
    raise ValueError("the x array must have exactly one dimension.")
ValueError: the x array must have exactly one dimension.
```

Then, keeping vectorize=True deleted and using np.interp, I get this error:

```
File "/mnt/d/Github/s5p-wrfchem/s5p_utils.py", line 264, in interp1d_np
    return np.interp(xi, x, data)
File "<__array_function__ internals>", line 6, in interp
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1412, in interp
    return interp_func(x, xp, fp, left, right)
ValueError: object too deep for desired array
```

If I add vectorize=True back and use np.interp, I get the error mentioned before:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes
    % (dim, size, dim_sizes[dim]))
ValueError: inconsistent size for core dimension 'dim0': 2 vs 39
```

For the one without .values, this is the result of repr(s5p['p']):

```
<xarray.DataArray (bottom_top: 25, y: 389, x: 450)>
dask.array<where, shape=(25, 389, 450), dtype=float32, chunksize=(25, 389, 450), chunktype=numpy.ndarray>
Coordinates:
  * bottom_top  (bottom_top) int32 0 1 2 3 4 5 6 7 8 ... 17 18 19 20 21 22 23 24
    vertices    int32 0
    crs         object +proj=latlong +datum=WGS84 +ellps=WGS84 +type=crs
Dimensions without coordinates: y, x
Attributes:
    name:          p
    resolution:    None
    calibration:   None
    polarization:  None
    level:         None
    modifiers:     ()
    units:         hPa
```

After bottom_top is renamed to new_dim, it works without error for both the scipy and numpy interpolation functions.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Interpolate 3D array by another 3D array 593770078
609031899 https://github.com/pydata/xarray/issues/3931#issuecomment-609031899 https://api.github.com/repos/pydata/xarray/issues/3931 MDEyOklzc3VlQ29tbWVudDYwOTAzMTg5OQ== zxdawn 30388627 2020-04-04T13:52:28Z 2020-04-04T13:52:28Z NONE

I tested again with a subset of my data:

```
subset_no2 = regrid_vars['no2'].isel(x=277, y=[212, 213])
subset_p = regrid_vars['p'].isel(x=277, y=[212, 213])
subset_interp = s5p['p'].isel(x=277, y=[212, 213])

interped = xr.apply_ufunc(
    interp1d_np,
    subset_no2,
    subset_p,
    subset_interp,
    input_core_dims=[["bottom_top"], ["bottom_top"], ["new_dim"]],
    output_core_dims=[["new_dim"]],
    exclude_dims=set(("bottom_top",)),
    vectorize=True,
)
```

Error without .values:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/xarray/core/computation.py", line 508, in broadcast_compat_data
    list(core_dims), missing_core_dims
ValueError: operand to apply_ufunc has required core dimensions ['new_dim'], but some of these dimensions are absent on an input variable: ['new_dim']
```

Error with .values:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes
    % (dim, size, dim_sizes[dim]))
ValueError: inconsistent size for core dimension 'dim0': 2 vs 39
```

Details of DataArray:

## subset_no2 <xarray.DataArray 'no2' (bottom_top: 39, y: 2)> array([[1.24115179e-08, 6.27056852e-08], [6.80964068e-09, 4.52237474e-08], [4.69188675e-09, 2.54678234e-08], [3.53337218e-09, 1.65583661e-08], [2.94962740e-09, 1.59282658e-08], [2.59346789e-09, 1.18680378e-08], [2.20434986e-09, 6.98941734e-09], [1.70838029e-09, 4.09148835e-09], [1.08785037e-09, 2.11626991e-09], [5.40526199e-10, 7.51218841e-10], [3.40114302e-10, 2.83674335e-10], [2.25290863e-10, 2.03432518e-10], [1.88406983e-10, 1.77420169e-10], [1.64951814e-10, 1.58818626e-10], [1.32610296e-10, 1.46572637e-10], [1.07792915e-10, 1.38499777e-10], [9.41847784e-11, 9.92248621e-11], [8.43529921e-11, 7.64672477e-11], [8.50483741e-11, 6.09330335e-11], [9.88087134e-11, 7.22940627e-11], [1.12557403e-10, 8.70426616e-11], [1.26527656e-10, 1.12620613e-10], [1.18148820e-10, 1.52514333e-10], [1.14522875e-10, 2.64312333e-10], [1.08898568e-10, 4.51579313e-10], [7.86399974e-11, 4.47694522e-10], [4.73609487e-11, 3.14831089e-10], [4.00449127e-11, 2.01112967e-10], [6.23887273e-11, 1.39728893e-10], [8.12143663e-11, 1.09831490e-10], [7.69666632e-11, 8.47591237e-11], [6.62737034e-11, 6.67154422e-11], [7.04659314e-11, 6.81855965e-11], [8.89134542e-11, 8.27209545e-11], [1.14639174e-10, 1.24251589e-10], [1.39306685e-10, 1.77576530e-10], [1.87629863e-10, 2.37522657e-10], [2.79661049e-10, 3.35704699e-10], [3.84697368e-10, 4.34654679e-10]]) Coordinates: XTIME datetime64[ns] 2019-07-25T05:40:00 lon (y) float32 118.88653 118.87 lat (y) float32 31.982988 32.046158 Dimensions without coordinates: bottom_top, y ## subset_p <xarray.DataArray (bottom_top: 39, y: 2)> array([[999.21183185, 994.82226662], [992.45297279, 988.09617577], [983.90273668, 979.58676312], [973.14155175, 968.88817802], [959.73882983, 955.55701426], [943.2266928 , 939.13366778], [923.14843372, 919.16002955], [899.1301363 , 895.27449236], [870.93359135, 867.24191033], [838.54076775, 835.04477768], [802.19838977, 798.92594777], [762.42839118, 759.41125882], [720.01658748, 717.276933 ], [675.82211003, 673.37656836], [630.36177484, 628.21954216], [583.89080793, 582.06254511], [536.71087969, 535.2179208 ], [489.22426113, 488.04157991], [442.01029323, 441.13686917], [397.48824388, 396.89283447], [357.43902179, 357.05545246], [321.40740822, 321.16476787], [288.98307787, 288.8348624 ], [259.79715824, 259.73242936], [233.52221354, 233.53890789], [209.88217625, 209.9574665 ], [188.6518575 , 188.74680403], [169.61437427, 169.67585118], [152.5459371 , 152.54166587], [137.21135599, 137.1660674 ], [123.42544258, 123.36597354], [111.02212197, 110.9501009 ], [ 99.84275351, 99.7735498 ], [ 89.78023477, 89.72146162], [ 80.73068588, 80.68572074], [ 72.598215 , 72.56306462], [ 65.28822276, 65.25848141], [ 58.71494192, 58.69333156], [ 52.80223723, 52.79301171]]) Coordinates: XTIME datetime64[ns] 2019-07-25T05:40:00 lon (y) float32 118.88653 118.87 lat (y) float32 31.982988 32.046158 Dimensions without coordinates: bottom_top, y ## subset_interp <xarray.DataArray (bottom_top: 25, y: 2)> dask.array<getitem, shape=(25, 2), dtype=float32, chunksize=(25, 2), chunktype=numpy.ndarray> Coordinates: * bottom_top (bottom_top) int32 0 1 2 3 4 5 6 7 8 ... 17 18 19 20 21 22 23 24 vertices int32 0 crs object +proj=latlong +datum=WGS84 +ellps=WGS84 +type=crs Dimensions without coordinates: y Attributes: name: p resolution: None calibration: None polarization: None level: None modifiers: () units: hPa

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Interpolate 3D array by another 3D array 593770078
609027716 https://github.com/pydata/xarray/issues/3931#issuecomment-609027716 https://api.github.com/repos/pydata/xarray/issues/3931 MDEyOklzc3VlQ29tbWVudDYwOTAyNzcxNg== zxdawn 30388627 2020-04-04T13:22:31Z 2020-04-04T13:23:39Z NONE

@dcherian If .values is removed, I got this error:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/xarray/core/computation.py", line 508, in broadcast_compat_data
    list(core_dims), missing_core_dims
ValueError: operand to apply_ufunc has required core dimensions ['new_dim'], but some of these dimensions are absent on an input variable: ['new_dim']
```

Here's the information of regrid_vars['no2'], regrid_vars['p'] and s5p['p']:

```
<xarray.DataArray 'no2' (bottom_top: 39, y: 389, x: 450)>
<xarray.DataArray (bottom_top: 39, y: 389, x: 450)>
<xarray.DataArray (bottom_top: 25, y: 389, x: 450)>
```

BTW, I have nan values in regrid_vars['no2'] and regrid_vars['p']. I think that wouldn't cause that error.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Interpolate 3D array by another 3D array 593770078
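The `['new_dim'] ... absent on an input variable` error in the comment above appears because the third argument is passed as a DataArray whose vertical dimension is still named `bottom_top`, while `input_core_dims` declares it as `new_dim`; a later comment notes that renaming the dimension makes it work. A self-contained sketch of that pattern with small synthetic arrays (the variable names and shapes here are illustrative, not the original TROPOMI/WRF data):

```
import numpy as np
import xarray as xr
from scipy import interpolate


def interp1d_np(data, x, xi):
    f = interpolate.interp1d(x, data, fill_value='extrapolate')
    return f(xi)


# Small stand-ins for the arrays in the thread
no2 = xr.DataArray(np.random.rand(5, 3, 4), dims=['bottom_top', 'y', 'x'])
p = xr.DataArray(np.sort(np.random.rand(5, 3, 4), axis=0), dims=['bottom_top', 'y', 'x'])
target_p = xr.DataArray(np.sort(np.random.rand(2, 3, 4), axis=0), dims=['bottom_top', 'y', 'x'])

# Rename the vertical dimension of the target levels so it matches the declared core dim
target_p = target_p.rename({'bottom_top': 'new_dim'})

interped = xr.apply_ufunc(
    interp1d_np,
    no2,
    p,
    target_p,
    input_core_dims=[['bottom_top'], ['bottom_top'], ['new_dim']],
    output_core_dims=[['new_dim']],
    exclude_dims=set(('bottom_top',)),
    vectorize=True,
)
print(interped.dims)  # ('y', 'x', 'new_dim')
```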
609023296 https://github.com/pydata/xarray/issues/3931#issuecomment-609023296 https://api.github.com/repos/pydata/xarray/issues/3931 MDEyOklzc3VlQ29tbWVudDYwOTAyMzI5Ng== zxdawn 30388627 2020-04-04T12:45:03Z 2020-04-04T12:50:12Z NONE

@mathause Thanks! It works well. Here's the solution:

Code

```
def interp1d_np(data, x, xi):
    from scipy import interpolate
    # return np.interp(xi, x, data)
    f = interpolate.interp1d(x, data, fill_value='extrapolate')
    return f(xi)

interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    bottom_up,  # now arguments in the order expected by 'interp1d_np'
    pressure.values,  # as above
    interp_p.values,  # as above
    input_core_dims=[["z"], ["z"], ["new_z"]],  # list with one entry per arg
    output_core_dims=[["new_z"]],  # returned data has one dimension
    exclude_dims=set(("z",)),  # dimensions allowed to change size. Must be a set!
    vectorize=True,  # loop over non-core dims
)
interped = interped.rename({"new_z": "z"})

print(np.testing.assert_allclose(output.values, interped.values))
```

Result:

None

However, when I apply it to my real data, I got some errors:

Code

```
def interp1d_np(data, x, xi):
    from scipy import interpolate
    f = interpolate.interp1d(x, data, fill_value='extrapolate')
    return f(xi)


interped = xr.apply_ufunc(
    interp1d_np,
    regrid_vars['no2'],
    regrid_vars['p'].values,
    s5p['p'].values,
    input_core_dims=[["bottom_top"], ["bottom_top"], ["new_dim"]],
    output_core_dims=[["new_dim"]],
    exclude_dims=set(("bottom_top",)),
    vectorize=True,
)

```

Error:

```
File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes
    % (dim, size, dim_sizes[dim]))
ValueError: inconsistent size for core dimension 'dim0': 450 vs 39
```

Here's the output of print(regrid_vars['no2'].shape, regrid_vars['p'].values.shape, s5p['p'].values.shape):

```
(39, 389, 450) (39, 389, 450) (25, 389, 450)
```

The shapes look fine.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Interpolate 3D array by another 3D array 593770078
608106094 https://github.com/pydata/xarray/pull/3924#issuecomment-608106094 https://api.github.com/repos/pydata/xarray/issues/3924 MDEyOklzc3VlQ29tbWVudDYwODEwNjA5NA== zxdawn 30388627 2020-04-02T21:43:41Z 2020-04-02T21:43:41Z NONE

Hi @max-sixty, thanks. If this looks good, I'm glad to add a test for it.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Coordinates passed to interp have nan values 591643901
542599383 https://github.com/pydata/xarray/issues/3407#issuecomment-542599383 https://api.github.com/repos/pydata/xarray/issues/3407 MDEyOklzc3VlQ29tbWVudDU0MjU5OTM4Mw== zxdawn 30388627 2019-10-16T08:55:01Z 2019-10-16T08:55:01Z NONE

@DocOtak Thank you for your explanation! It works well now :)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Save 'S1' array without the char_dim_name dimension 507658070
529164506 https://github.com/pydata/xarray/issues/3290#issuecomment-529164506 https://api.github.com/repos/pydata/xarray/issues/3290 MDEyOklzc3VlQ29tbWVudDUyOTE2NDUwNg== zxdawn 30388627 2019-09-08T02:52:56Z 2019-09-08T02:52:56Z NONE

@shoyer Thanks. It's not a datetime64 array; this is the result of np.isnat(t):

```
File "/public/software/anaconda/anaconda3/envs/python36/lib/python3.6/site-packages/xarray-0.12.3-py3.6.egg/xarray/core/arithmetic.py", line 69, in __array_ufunc__
    dask='allowed')
File "/public/software/anaconda/anaconda3/envs/python36/lib/python3.6/site-packages/xarray-0.12.3-py3.6.egg/xarray/core/computation.py", line 969, in apply_ufunc
    keep_attrs=keep_attrs)
File "/public/software/anaconda/anaconda3/envs/python36/lib/python3.6/site-packages/xarray-0.12.3-py3.6.egg/xarray/core/computation.py", line 217, in apply_dataarray_vfunc
    result_var = func(*data_vars)
File "/public/software/anaconda/anaconda3/envs/python36/lib/python3.6/site-packages/xarray-0.12.3-py3.6.egg/xarray/core/computation.py", line 564, in apply_variable_ufunc
    result_data = func(*input_data)
TypeError: ufunc 'isnat' is only defined for datetime and timedelta.
```

I used pd.isnull(t).all() to check it, and it works. Actually it's all NaN. There's something wrong with the nc file; I will contact the data center. Thank you for all your help :)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Using min() with skipna=True 490593787
529163827 https://github.com/pydata/xarray/issues/3290#issuecomment-529163827 https://api.github.com/repos/pydata/xarray/issues/3290 MDEyOklzc3VlQ29tbWVudDUyOTE2MzgyNw== zxdawn 30388627 2019-09-08T02:39:30Z 2019-09-08T02:39:30Z NONE

@keewis I tried using np.isnan(t.values).all() to check whether it's all NaN, but I got this error:

```
print(np.isnan(t.values).all())
TypeError: ufunc 'isnan' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```

This is the type of t.values: <class 'numpy.ndarray'>

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Using min() with skipna=True 490593787
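As the neighbouring comments note, np.isnan fails on object-dtype arrays while pd.isnull handles them. A small self-contained illustration of that difference (the array here is made up):

```
import numpy as np
import pandas as pd

# Object-dtype array mixing NaN, NaT and None, similar to a raw 'time_utc' variable
t = np.array([np.nan, np.datetime64("NaT"), None], dtype=object)

# np.isnan(t) raises TypeError for object dtype; pd.isnull works element-wise
print(pd.isnull(t).all())  # True
```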
529163060 https://github.com/pydata/xarray/issues/3290#issuecomment-529163060 https://api.github.com/repos/pydata/xarray/issues/3290 MDEyOklzc3VlQ29tbWVudDUyOTE2MzA2MA== zxdawn 30388627 2019-09-08T02:22:01Z 2019-09-08T02:22:01Z NONE

@shoyer Thanks. It works now. But I have another question. This is the result of t = ds['time_utc']:

```
<xarray.DataArray 'time_utc' (time: 1, scanline: 357, ground_pixel: 450)>
array([[[nan, nan, ..., nan, nan],
        [nan, nan, ..., nan, nan],
        ...,
        [nan, nan, ..., nan, nan],
        [nan, nan, ..., nan, nan]]], dtype=object)
Coordinates:
  * scanline      (scanline) float64 1.0 2.0 3.0 4.0 ... 354.0 355.0 356.0 357.0
  * ground_pixel  (ground_pixel) float64 1.0 2.0 3.0 4.0 ... 448.0 449.0 450.0
  * time          (time) datetime64[ns] 2019-08-25
Attributes:
    long_name:  Time of observation as ISO 8601 date-time string
```

If I want to get the minimum value via t.min(skipna=True), I get a strange type:

```
<xarray.DataArray 'time_utc' ()>
array(<xarray.core.dtypes.AlwaysGreaterThan object at 0x7f96ac188550>, dtype=object)
```

I can't convert it to a string with str(t.min(skipna=True)).

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Using min() with skipna=True 490593787
529091168 https://github.com/pydata/xarray/issues/3290#issuecomment-529091168 https://api.github.com/repos/pydata/xarray/issues/3290 MDEyOklzc3VlQ29tbWVudDUyOTA5MTE2OA== zxdawn 30388627 2019-09-07T09:33:32Z 2019-09-07T09:33:32Z NONE

@max-sixty Actually, I'm using numpy 1.13.1 and I need skipna=True. I don't understand the error it shows.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Using min() with skipna=True 490593787
527422655 https://github.com/pydata/xarray/issues/3275#issuecomment-527422655 https://api.github.com/repos/pydata/xarray/issues/3275 MDEyOklzc3VlQ29tbWVudDUyNzQyMjY1NQ== zxdawn 30388627 2019-09-03T11:42:00Z 2019-09-03T11:42:00Z NONE

@dcherian I can't find how to do that with rcParams. Could you give a simple example?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Change the label size and tick label size of colorbar 488190500
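For the rcParams route asked about above: the colorbar label is an ordinary axis label on the colorbar's Axes and its tick labels are ordinary tick labels, so the generic matplotlib size settings should apply when set before plotting. An editor sketch assuming standard matplotlib rcParams keys (this is not an xarray-specific API):

```
import matplotlib.pyplot as plt
import xarray as xr

# Sizes picked up by the colorbar that xarray creates
plt.rcParams.update({
    "axes.labelsize": "large",   # colorbar label (an axis label on the colorbar Axes)
    "ytick.labelsize": "large",  # tick labels of a vertical colorbar
})

air = xr.tutorial.open_dataset("air_temperature").air - 273.15
air.isel(time=500).plot.pcolormesh()
plt.show()
```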
527150960 https://github.com/pydata/xarray/issues/3275#issuecomment-527150960 https://api.github.com/repos/pydata/xarray/issues/3275 MDEyOklzc3VlQ29tbWVudDUyNzE1MDk2MA== zxdawn 30388627 2019-09-02T13:36:18Z 2019-09-02T13:36:18Z NONE

@dcherian Thanks! Figured it out now:

```
import xarray as xr
import matplotlib.pyplot as plt

airtemps = xr.tutorial.open_dataset('air_temperature')
air = airtemps.air - 273.15
air2d = air.isel(time=500)

im = air2d.plot.pcolormesh(add_colorbar=False)
cb = plt.colorbar(im, orientation="horizontal", pad=0.15)
cb.set_label(label='Temperature ($^{\circ}$C)', size='large', weight='bold')
cb.ax.tick_params(labelsize='large')
```

{
    "total_count": 6,
    "+1": 6,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Change the label size and tick label size of colorbar 488190500
450458650 https://github.com/pydata/xarray/issues/2636#issuecomment-450458650 https://api.github.com/repos/pydata/xarray/issues/2636 MDEyOklzc3VlQ29tbWVudDQ1MDQ1ODY1MA== zxdawn 30388627 2018-12-29T02:38:00Z 2018-12-29T02:39:41Z NONE

@dcherian It works with netCDF4, but not with xarray:

```
file = Dataset('ds1.nc')
print(file.variables['time'], '\n')

with xr.open_dataset('ds1.nc') as f:
    print(f.time.attrs)
```

Output:

```
<class 'netCDF4._netCDF4.Variable'>
int64 time(time)
    units: hours since 2015-01-01
unlimited dimensions:
current shape = (3,)
filling on, default _FillValue of -9223372036854775806 used

OrderedDict()
```

What's the difference between units and attrs?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset change the attributes of Coordinates 394625579
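Regarding the units-vs-attrs question above: when xarray opens a file it CF-decodes the time variable, turning the `units` string into datetime64 values, which is why the attribute no longer appears in `.attrs` (it is kept in `.encoding` instead). A sketch showing how to see the raw attribute by disabling decoding; the file name follows the comment, and the exact printed values depend on the file:

```
import xarray as xr

# With decoding disabled, time stays as raw integers and keeps its 'units' attribute
with xr.open_dataset('ds1.nc', decode_times=False) as f:
    print(f.time.attrs)     # e.g. {'units': 'hours since 2015-01-01'}

# With default decoding, the units are consumed to build datetime64 values
with xr.open_dataset('ds1.nc') as f:
    print(f.time.dtype)     # datetime64[ns]
    print(f.time.encoding)  # the original 'units' lives here after decoding
```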
450456616 https://github.com/pydata/xarray/issues/2636#issuecomment-450456616 https://api.github.com/repos/pydata/xarray/issues/2636 MDEyOklzc3VlQ29tbWVudDQ1MDQ1NjYxNg== zxdawn 30388627 2018-12-29T02:17:49Z 2018-12-29T02:17:49Z NONE

@xylar Thanks! I just found another question similar to this one.

I've tried some operations:

```
with xr.open_dataset('merge.nc') as f:
    print(f['temperature'], '\n')
    print('---------------------------')
    print(f.mean(dim='time'))
    print('---------------------------')
    print(f['temperature'].loc[:, :, '2015-01-05T04:00:00'])
    print('---------------------------')
```

It works fine:

```
<xarray.DataArray 'temperature' (x: 2, y: 2, time: 6)>
array([[[-0.022611, -1.428088, -0.655508,  0.977389, -0.428088,  0.344492],
        [ 0.430102,  0.996973, -0.882054,  1.430102,  1.996973,  0.117946]],

       [[ 0.157233, -0.230397, -0.505775,  1.157233,  0.769603,  0.494225],
        [-0.075826, -1.933904, -0.823982,  0.924174, -0.933904,  0.176018]]])
Coordinates:
    lon      (x, y) float64 ...
    lat      (x, y) float64 ...
  * time     (time) datetime64[ns] 2015-01-05T04:00:00 2015-01-05T05:00:00 ...
Dimensions without coordinates: x, y

<xarray.Dataset>
Dimensions:      (x: 2, y: 2)
Coordinates:
    lon          (x, y) float64 ...
    lat          (x, y) float64 ...
Dimensions without coordinates: x, y
Data variables:
    temperature  (x, y) float64 -0.2021 0.6817 0.307 -0.4446

<xarray.DataArray 'temperature' (x: 2, y: 2)>
array([[-0.022611,  0.430102],
       [ 0.157233, -0.075826]])
Coordinates:
    lon      (x, y) float64 ...
    lat      (x, y) float64 ...
    time     datetime64[ns] 2015-01-05T04:00:00
Dimensions without coordinates: x, y
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  open_mfdataset change the attributes of Coordinates 394625579

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);