
issue_comments


5 rows where issue = 593770078 and user = 30388627 sorted by updated_at descending

Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sorted), author_association, body, reactions, performed_via_github_app, issue
id: 609040104 · node_id: MDEyOklzc3VlQ29tbWVudDYwOTA0MDEwNA==
html_url: https://github.com/pydata/xarray/issues/3931#issuecomment-609040104
issue_url: https://api.github.com/repos/pydata/xarray/issues/3931
user: zxdawn (30388627) · author_association: NONE
created_at: 2020-04-04T14:51:32Z · updated_at: 2020-04-04T14:51:32Z

@mathause Thanks! Shall we close this issue?

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Interpolate 3D array by another 3D array (593770078)
id: 609038408 · node_id: MDEyOklzc3VlQ29tbWVudDYwOTAzODQwOA==
html_url: https://github.com/pydata/xarray/issues/3931#issuecomment-609038408
issue_url: https://api.github.com/repos/pydata/xarray/issues/3931
user: zxdawn (30388627) · author_association: NONE
created_at: 2020-04-04T14:39:27Z · updated_at: 2020-04-04T14:39:27Z

@mathause For the .values version: if I delete vectorize=True, I get this error:

    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/scipy/interpolate/interpolate.py", line 455, in __init__
        raise ValueError("the x array must have exactly one dimension.")
    ValueError: the x array must have exactly one dimension.

If I keep vectorize=True removed and switch to np.interp, I get this error:

    File "/mnt/d/Github/s5p-wrfchem/s5p_utils.py", line 264, in interp1d_np
        return np.interp(xi, x, data)
    File "<__array_function__ internals>", line 6, in interp
    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1412, in interp
        return interp_func(x, xp, fp, left, right)
    ValueError: object too deep for desired array

If I add vectorize=True back and use np.interp, I get the error mentioned before:

    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes
        % (dim, size, dim_sizes[dim]))
    ValueError: inconsistent size for core dimension 'dim0': 2 vs 39

For the version without .values, this is the result of repr(s5p['p']):

    <xarray.DataArray (bottom_top: 25, y: 389, x: 450)>
    dask.array<where, shape=(25, 389, 450), dtype=float32, chunksize=(25, 389, 450), chunktype=numpy.ndarray>
    Coordinates:
      * bottom_top  (bottom_top) int32 0 1 2 3 4 5 6 7 8 ... 17 18 19 20 21 22 23 24
        vertices    int32 0
        crs         object +proj=latlong +datum=WGS84 +ellps=WGS84 +type=crs
    Dimensions without coordinates: y, x
    Attributes:
        name:          p
        resolution:    None
        calibration:   None
        polarization:  None
        level:         None
        modifiers:     ()
        units:         hPa

After bottom_top is renamed to new_dim, it works without error for both the scipy and numpy interpolation functions.
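For readers skimming the thread, here is a small, self-contained sketch of the pattern that ends up working, with synthetic arrays standing in for regrid_vars['no2'], regrid_vars['p'] and s5p['p']. The sizes and the names p_src/p_target are invented; only the rename of bottom_top to new_dim comes from the comment above.

```
# Self-contained sketch of the working pattern described above (synthetic data).
import numpy as np
import xarray as xr
from scipy import interpolate


def interp1d_np(data, x, xi):
    # 1-D interpolation along the vertical axis, as used throughout the thread.
    f = interpolate.interp1d(x, data, fill_value="extrapolate")
    return f(xi)


no2 = xr.DataArray(np.random.rand(39, 4, 5), dims=("bottom_top", "y", "x"))
p_src = xr.DataArray(np.sort(np.random.rand(39, 4, 5), axis=0), dims=("bottom_top", "y", "x"))
p_target = xr.DataArray(np.sort(np.random.rand(25, 4, 5), axis=0), dims=("bottom_top", "y", "x"))

# The step that resolves the thread: rename the vertical dimension of the
# target levels so apply_ufunc finds "new_dim" on a *labeled* input.
p_target = p_target.rename({"bottom_top": "new_dim"})

interped = xr.apply_ufunc(
    interp1d_np,
    no2,       # data to interpolate, core dim "bottom_top"
    p_src,     # source levels, core dim "bottom_top"
    p_target,  # target levels, core dim "new_dim"
    input_core_dims=[["bottom_top"], ["bottom_top"], ["new_dim"]],
    output_core_dims=[["new_dim"]],
    exclude_dims=set(("bottom_top",)),
    vectorize=True,
)
print(interped.dims)  # ('y', 'x', 'new_dim')
```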

reactions:
{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Interpolate 3D array by another 3D array (593770078)
id: 609031899 · node_id: MDEyOklzc3VlQ29tbWVudDYwOTAzMTg5OQ==
html_url: https://github.com/pydata/xarray/issues/3931#issuecomment-609031899
issue_url: https://api.github.com/repos/pydata/xarray/issues/3931
user: zxdawn (30388627) · author_association: NONE
created_at: 2020-04-04T13:52:28Z · updated_at: 2020-04-04T13:52:28Z

I tested again with a subset of my data:

    subset_no2 = regrid_vars['no2'].isel(x=277, y=[212, 213])
    subset_p = regrid_vars['p'].isel(x=277, y=[212, 213])
    subset_interp = s5p['p'].isel(x=277, y=[212, 213])

    interped = xr.apply_ufunc(
        interp1d_np,
        subset_no2,
        subset_p,
        subset_interp,
        input_core_dims=[["bottom_top"], ["bottom_top"], ["new_dim"]],
        output_core_dims=[["new_dim"]],
        exclude_dims=set(("bottom_top",)),
        vectorize=True,
    )

Error without .values (a minimal illustration of this error follows at the end of this comment):

    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/xarray/core/computation.py", line 508, in broadcast_compat_data
        list(core_dims), missing_core_dims
    ValueError: operand to apply_ufunc has required core dimensions ['new_dim'], but some of these dimensions are absent on an input variable: ['new_dim']

Error with .values:

    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes
        % (dim, size, dim_sizes[dim]))
    ValueError: inconsistent size for core dimension 'dim0': 2 vs 39

Details of DataArray:

## subset_no2

    <xarray.DataArray 'no2' (bottom_top: 39, y: 2)>
    array([[1.24115179e-08, 6.27056852e-08],
           [6.80964068e-09, 4.52237474e-08],
           [4.69188675e-09, 2.54678234e-08],
           [3.53337218e-09, 1.65583661e-08],
           [2.94962740e-09, 1.59282658e-08],
           [2.59346789e-09, 1.18680378e-08],
           [2.20434986e-09, 6.98941734e-09],
           [1.70838029e-09, 4.09148835e-09],
           [1.08785037e-09, 2.11626991e-09],
           [5.40526199e-10, 7.51218841e-10],
           [3.40114302e-10, 2.83674335e-10],
           [2.25290863e-10, 2.03432518e-10],
           [1.88406983e-10, 1.77420169e-10],
           [1.64951814e-10, 1.58818626e-10],
           [1.32610296e-10, 1.46572637e-10],
           [1.07792915e-10, 1.38499777e-10],
           [9.41847784e-11, 9.92248621e-11],
           [8.43529921e-11, 7.64672477e-11],
           [8.50483741e-11, 6.09330335e-11],
           [9.88087134e-11, 7.22940627e-11],
           [1.12557403e-10, 8.70426616e-11],
           [1.26527656e-10, 1.12620613e-10],
           [1.18148820e-10, 1.52514333e-10],
           [1.14522875e-10, 2.64312333e-10],
           [1.08898568e-10, 4.51579313e-10],
           [7.86399974e-11, 4.47694522e-10],
           [4.73609487e-11, 3.14831089e-10],
           [4.00449127e-11, 2.01112967e-10],
           [6.23887273e-11, 1.39728893e-10],
           [8.12143663e-11, 1.09831490e-10],
           [7.69666632e-11, 8.47591237e-11],
           [6.62737034e-11, 6.67154422e-11],
           [7.04659314e-11, 6.81855965e-11],
           [8.89134542e-11, 8.27209545e-11],
           [1.14639174e-10, 1.24251589e-10],
           [1.39306685e-10, 1.77576530e-10],
           [1.87629863e-10, 2.37522657e-10],
           [2.79661049e-10, 3.35704699e-10],
           [3.84697368e-10, 4.34654679e-10]])
    Coordinates:
        XTIME    datetime64[ns] 2019-07-25T05:40:00
        lon      (y) float32 118.88653 118.87
        lat      (y) float32 31.982988 32.046158
    Dimensions without coordinates: bottom_top, y

## subset_p

    <xarray.DataArray (bottom_top: 39, y: 2)>
    array([[999.21183185, 994.82226662],
           [992.45297279, 988.09617577],
           [983.90273668, 979.58676312],
           [973.14155175, 968.88817802],
           [959.73882983, 955.55701426],
           [943.2266928 , 939.13366778],
           [923.14843372, 919.16002955],
           [899.1301363 , 895.27449236],
           [870.93359135, 867.24191033],
           [838.54076775, 835.04477768],
           [802.19838977, 798.92594777],
           [762.42839118, 759.41125882],
           [720.01658748, 717.276933  ],
           [675.82211003, 673.37656836],
           [630.36177484, 628.21954216],
           [583.89080793, 582.06254511],
           [536.71087969, 535.2179208 ],
           [489.22426113, 488.04157991],
           [442.01029323, 441.13686917],
           [397.48824388, 396.89283447],
           [357.43902179, 357.05545246],
           [321.40740822, 321.16476787],
           [288.98307787, 288.8348624 ],
           [259.79715824, 259.73242936],
           [233.52221354, 233.53890789],
           [209.88217625, 209.9574665 ],
           [188.6518575 , 188.74680403],
           [169.61437427, 169.67585118],
           [152.5459371 , 152.54166587],
           [137.21135599, 137.1660674 ],
           [123.42544258, 123.36597354],
           [111.02212197, 110.9501009 ],
           [ 99.84275351,  99.7735498 ],
           [ 89.78023477,  89.72146162],
           [ 80.73068588,  80.68572074],
           [ 72.598215  ,  72.56306462],
           [ 65.28822276,  65.25848141],
           [ 58.71494192,  58.69333156],
           [ 52.80223723,  52.79301171]])
    Coordinates:
        XTIME    datetime64[ns] 2019-07-25T05:40:00
        lon      (y) float32 118.88653 118.87
        lat      (y) float32 31.982988 32.046158
    Dimensions without coordinates: bottom_top, y

## subset_interp

    <xarray.DataArray (bottom_top: 25, y: 2)>
    dask.array<getitem, shape=(25, 2), dtype=float32, chunksize=(25, 2), chunktype=numpy.ndarray>
    Coordinates:
      * bottom_top  (bottom_top) int32 0 1 2 3 4 5 6 7 8 ... 17 18 19 20 21 22 23 24
        vertices    int32 0
        crs         object +proj=latlong +datum=WGS84 +ellps=WGS84 +type=crs
    Dimensions without coordinates: y
    Attributes:
        name:          p
        resolution:    None
        calibration:   None
        polarization:  None
        level:         None
        modifiers:     ()
        units:         hPa
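The first error quoted above (missing core dimensions) can be reproduced in isolation: apply_ufunc can only match input_core_dims by name on labeled inputs, so asking for "new_dim" on a DataArray whose vertical dimension is still called "bottom_top" fails before the interpolation function is ever called. A minimal sketch with a synthetic array and arbitrary sizes:

```
import numpy as np
import xarray as xr

# A DataArray whose vertical dimension is named "bottom_top", like subset_interp above.
target = xr.DataArray(np.random.rand(25, 2), dims=("bottom_top", "y"))

try:
    xr.apply_ufunc(
        lambda xi: xi,                  # placeholder function; never reached
        target,
        input_core_dims=[["new_dim"]],  # a name the array does not carry
        output_core_dims=[["new_dim"]],
    )
except ValueError as err:
    print(err)  # "... required core dimensions ['new_dim'] ... absent on an input variable ..."
```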

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Interpolate 3D array by another 3D array (593770078)
id: 609027716 · node_id: MDEyOklzc3VlQ29tbWVudDYwOTAyNzcxNg==
html_url: https://github.com/pydata/xarray/issues/3931#issuecomment-609027716
issue_url: https://api.github.com/repos/pydata/xarray/issues/3931
user: zxdawn (30388627) · author_association: NONE
created_at: 2020-04-04T13:22:31Z · updated_at: 2020-04-04T13:23:39Z

@dcherian If .values is removed, I got this error:

    File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/xarray/core/computation.py", line 508, in broadcast_compat_data
        list(core_dims), missing_core_dims
    ValueError: operand to apply_ufunc has required core dimensions ['new_dim'], but some of these dimensions are absent on an input variable: ['new_dim']

Here's the information of regrid_vars['no2'], regrid_vars['p'] and s5p['p']:

    <xarray.DataArray 'no2' (bottom_top: 39, y: 389, x: 450)>
    <xarray.DataArray (bottom_top: 39, y: 389, x: 450)>
    <xarray.DataArray (bottom_top: 25, y: 389, x: 450)>

BTW, I have NaN values in regrid_vars['no2'] and regrid_vars['p']; I don't think that would cause this error, though.
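A quick check supporting that last point: the missing-core-dimension error is about dimension names, not data values, and the 1-D interpolators simply propagate NaNs rather than raising. A small sketch:

```
import numpy as np
from scipy import interpolate

x = np.array([1.0, 2.0, 3.0, 4.0])
data = np.array([10.0, np.nan, 30.0, 40.0])  # NaN in the data, as in regrid_vars above

f = interpolate.interp1d(x, data, fill_value="extrapolate")
print(f(np.array([1.5, 3.5])))  # [nan 35.]; the NaN propagates, no error is raised
```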

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Interpolate 3D array by another 3D array (593770078)
id: 609023296 · node_id: MDEyOklzc3VlQ29tbWVudDYwOTAyMzI5Ng==
html_url: https://github.com/pydata/xarray/issues/3931#issuecomment-609023296
issue_url: https://api.github.com/repos/pydata/xarray/issues/3931
user: zxdawn (30388627) · author_association: NONE
created_at: 2020-04-04T12:45:03Z · updated_at: 2020-04-04T12:50:12Z

@mathause Thanks! It works well. Here's the solution:

Code

```
def interp1d_np(data, x, xi):
    from scipy import interpolate
    # return np.interp(xi, x, data)
    f = interpolate.interp1d(x, data, fill_value='extrapolate')
    return f(xi)


interped = xr.apply_ufunc(
    interp1d_np,       # first the function
    bottom_up,         # now arguments in the order expected by 'interp1d_np'
    pressure.values,   # as above
    interp_p.values,   # as above
    input_core_dims=[["z"], ["z"], ["new_z"]],  # list with one entry per arg
    output_core_dims=[["new_z"]],  # returned data has one dimension
    exclude_dims=set(("z",)),  # dimensions allowed to change size. Must be a set!
    vectorize=True,  # loop over non-core dims
)
interped = interped.rename({"new_z": "z"})

print(np.testing.assert_allclose(output.values, interped.values))
```

Result:

None
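The code above relies on variables defined earlier in the issue (bottom_up, pressure, interp_p, output) that are not shown on this page; the printed None simply means np.testing.assert_allclose found no mismatch. A self-contained stand-in might look like the sketch below. All array definitions here are invented, and passing raw .values only lines up because the z dimension is deliberately placed on the last axis:

```
import numpy as np
import xarray as xr
from scipy import interpolate


def interp1d_np(data, x, xi):
    f = interpolate.interp1d(x, data, fill_value="extrapolate")
    return f(xi)


# Invented stand-ins: z must be the trailing axis for the raw .values to work here.
bottom_up = xr.DataArray(np.random.rand(3, 4, 10), dims=("y", "x", "z"))
pressure = xr.DataArray(np.sort(np.random.rand(3, 4, 10), axis=-1), dims=("y", "x", "z"))
interp_p = xr.DataArray(np.sort(np.random.rand(3, 4, 6), axis=-1), dims=("y", "x", "new_z"))

# Reference answer computed with an explicit loop over the non-core dims.
output = np.empty((3, 4, 6))
for j in range(3):
    for i in range(4):
        output[j, i, :] = interp1d_np(
            bottom_up.values[j, i, :], pressure.values[j, i, :], interp_p.values[j, i, :]
        )
output = xr.DataArray(output, dims=("y", "x", "z"))

interped = xr.apply_ufunc(
    interp1d_np,
    bottom_up,
    pressure.values,
    interp_p.values,
    input_core_dims=[["z"], ["z"], ["new_z"]],
    output_core_dims=[["new_z"]],
    exclude_dims=set(("z",)),
    vectorize=True,
)
interped = interped.rename({"new_z": "z"})

print(np.testing.assert_allclose(output.values, interped.values))  # None on success
```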

However, when I apply it to my real data, I got some errors:

Code

```
def interp1d_np(data, x, xi):
    from scipy import interpolate
    f = interpolate.interp1d(x, data, fill_value='extrapolate')
    return f(xi)

interped = xr.apply_ufunc(
    interp1d_np,
    regrid_vars['no2'],
    regrid_vars['p'].values,
    s5p['p'].values,
    input_core_dims=[["bottom_top"], ["bottom_top"], ["new_dim"]],
    output_core_dims=[["new_dim"]],
    exclude_dims=set(("bottom_top",)),
    vectorize=True,
)

```

Error:

File "/home/xin/miniconda3/envs/satpy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 1830, in _update_dim_sizes % (dim, size, dim_sizes[dim])) ValueError: inconsistent size for core dimension 'dim0': 450 vs 39 Here's the output of print(regrid_vars['no2'].shape, regrid_vars['p'].values.shape, s5p['p'].values.shape): (39, 389, 450) (39, 389, 450) (25, 389, 450) The shape looks fine.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Interpolate 3D array by another 3D array (593770078)


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
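For completeness, the query behind this page ("5 rows where issue = 593770078 and user = 30388627 sorted by updated_at descending") can be run directly against the underlying SQLite database. A minimal sketch, assuming the data lives in a local file named github.db (hypothetical path):

```
import sqlite3

# Hypothetical database filename; point this at the actual Datasette SQLite file.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = 593770078 AND [user] = 30388627
    ORDER BY updated_at DESC
    """
).fetchall()
for comment_id, created, updated, assoc, body in rows:
    print(comment_id, updated, assoc, body[:60].replace("\n", " "))
conn.close()
```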