
issues


4 rows where type = "issue" and user = 25473287 sorted by updated_at descending

Issue #1475: Allow DataArray to hold cell boundaries as coordinate variables
id: 242181620 · node_id: MDU6SXNzdWUyNDIxODE2MjA= · user: JiaweiZhuang (25473287) · state: open · locked: 0 · comments: 14
created_at: 2017-07-11T20:58:44Z · updated_at: 2023-08-24T13:21:05Z · author_association: NONE

Cell boundaries can be stored either as N+1-sized arrays, as suggested in xgcm/xmitgcm#15, or as (N, 2)-sized arrays, as suggested by the CF conventions. However, a DataArray can hold neither kind of coordinate variable, because both introduce a new dimension.

If you try to assign such a coordinate to a DataArray with dr.assign_coords(), you get ValueError: cannot add coordinates with new dimensions to a DataArray.
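A minimal sketch of the failure, with made-up data (4 cell centers, so the boundaries are an N+1 = 5 element array):

    import numpy as np
    import xarray as xr

    dr = xr.DataArray(np.random.rand(4), dims=['lat'],
                      coords={'lat': [10.0, 20.0, 30.0, 40.0]})
    lat_b = np.linspace(5.0, 45.0, 5)  # cell boundaries, size N+1

    # Raises: ValueError: cannot add coordinates with new dimensions to a DataArray
    dr.assign_coords(lat_b=('lat_b', lat_b))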

On the other hand, if your Dataset contains cell boundary variables (for example, #667), the bounds are dropped when you extract a single variable into a DataArray.

Having cell bounds available in a DataArray is important for a couple of applications:

  • Pass cell bounds to DataArray's plotting methods (xgcm/xmitgcm#15). I am aware of the discussion about inferring boundaries (#781). However, for the Cube-Sphere grid or the Lat-Lon-Cap grid (reference), which have tiles covering the poles, I have to explicitly pass cell bounds to the underlying plt.pcolormesh() call to get a good-looking plot (see this comment for details, and the sketch after this list).

  • For conservative (i.e. area-weighted) regridding (mentioned in #486). Cell centers are enough for bilinear interpolation or other simple resampling, but for any finite-volume mesh, knowing the boundaries is crucial if you want to conserve the total amount of mass or flux.
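For the plotting case, here is a minimal sketch of the kind of pcolormesh call meant in the first bullet, with made-up boundary arrays (the data has shape (ny, nx), while the corner arrays have shape (ny+1, nx+1)):

    import numpy as np
    import matplotlib.pyplot as plt

    ny, nx = 45, 90
    # Cell corner (boundary) arrays: one extra point in each direction
    lon_b, lat_b = np.meshgrid(np.linspace(-180, 180, nx + 1),
                               np.linspace(-90, 90, ny + 1))
    data = np.random.rand(ny, nx)

    # pcolormesh draws each cell from its corner coordinates,
    # which is why the N+1-sized boundaries must be passed explicitly
    plt.pcolormesh(lon_b, lat_b, data)
    plt.colorbar()
    plt.show()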

Plotting or regridding will work fine if you pass cell bounds as an additional argument to a wrapper function. However, having a single DataArray object that carries the boundary information seems like a more elegant solution. Is it possible to let DataArray accept N+1-sized coordinate variables, and to inherit them from the parent Dataset? If that's too drastic, is it possible to write an accessor to extend DataArray's capability? Say, a "bnd" accessor for a new attribute ds.bnd['lat_b'], which is kept when a DataArray gets extracted (ds['data_var'].bnd['lat_b'])? Does this make sense?
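A minimal sketch of what such an accessor could look like, using xarray's register_dataarray_accessor; the names bnd and BoundsAccessor are hypothetical, and making the bounds automatically survive extraction from the parent Dataset would need additional machinery:

    import xarray as xr

    @xr.register_dataarray_accessor('bnd')
    class BoundsAccessor:
        """Hypothetical accessor that stores boundary variables next to a DataArray."""

        def __init__(self, da):
            self._da = da
            self._bounds = {}

        def __setitem__(self, name, bounds):
            self._bounds[name] = bounds

        def __getitem__(self, name):
            return self._bounds[name]

With this, dr.bnd['lat_b'] = lat_b and dr.bnd['lat_b'] work on a single DataArray object, since xarray caches one accessor instance per object.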

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1475/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
repo: xarray (13221727) · type: issue
Issue #2281: Does interp() work on curvilinear grids (2D coordinates)?
id: 340486433 · node_id: MDU6SXNzdWUzNDA0ODY0MzM= · user: JiaweiZhuang (25473287) · state: open · locked: 0 · comments: 28
created_at: 2018-07-12T04:36:43Z · updated_at: 2020-09-04T19:24:32Z · author_association: NONE

I am evaluating interp() against xESMF. Here's how xESMF would regrid the built-in 'rasm' data to a regular lat-lon grid.
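The regridding snippet itself is not included in this export; roughly, the xESMF call would look like the sketch below (the exact arguments are an assumption, using xesmf.Regridder on the tutorial data after renaming xc/yc to the lon/lat names xESMF expects):

    import numpy as np
    import xarray as xr
    import xesmf as xe

    ds = xr.tutorial.load_dataset('rasm').rename({'xc': 'lon', 'yc': 'lat'})

    # Destination: a regular lat-lon grid
    ds_out = xr.Dataset({'lon': (['lon'], np.linspace(-180, 180, 120)),
                         'lat': (['lat'], np.linspace(-90, 90, 60))})

    regridder = xe.Regridder(ds, ds_out, 'bilinear', periodic=True)
    dr_out = regridder(ds['Tair'])   # curvilinear (y, x) -> regular (lat, lon)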

Seems like interp() can convert rectilinear grids (1D coordinates) to curvilinear grids (2D coordinates), according to the last example. How about the reverse? Can it convert 2D coordinates to 1D, or to another set of 2D coordinates?

Here's the test data:

    dr = xr.tutorial.load_dataset('rasm')['Tair']
    ...
    Coordinates:
      * time    (time) datetime64[ns] 1980-09-16T12:00:00 1980-10-17 ...
        xc      (y, x) float64 189.2 189.4 189.6 189.7 189.9 190.1 190.2 190.4 ...
        yc      (y, x) float64 16.53 16.78 17.02 17.27 17.51 17.76 18.0 18.25 ...

Here's a simple destination grid:

    lon = xr.DataArray(np.linspace(-180, 180, 120), dims=['lon'])
    lat = xr.DataArray(np.linspace(-90, 90, 60), dims=['lat'])

I would expect a syntax like:

    dr_out = dr.interp(xc=lon, yc=lat)

But I got ValueError: dimensions ['xc', 'yc'] do not exist, because interp() only accepts dimensions (which must be 1D), not coordinates.

dr.interp(x=lon, y=lat) runs but the result is not correct. This is expected because x does not mean longitude in the original data.

@crusaderky @fujiisoup

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2281/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: issue
Issue #1789: Broken link to github on ReadTheDocs
id: 282916278 · node_id: MDU6SXNzdWUyODI5MTYyNzg= · user: JiaweiZhuang (25473287) · state: closed · locked: 0 · comments: 8
created_at: 2017-12-18T15:24:04Z · updated_at: 2018-10-25T16:24:27Z · closed_at: 2018-10-25T16:24:27Z · author_association: NONE

The "Edit on GitHub" button on ReadTheDocs is linked to https://github.com/pydata/xarray/blob/origin/stable/doc/index.rst which doesn't exist. The correct URL would be without "origin": https://github.com/pydata/xarray/blob/stable/doc/index.rst

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1789/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
Issue #1931: apply_ufunc produces illegal coordinate sizes
id: 298834332 · node_id: MDU6SXNzdWUyOTg4MzQzMzI= · user: JiaweiZhuang (25473287) · state: closed · locked: 0 · comments: 5
created_at: 2018-02-21T03:58:19Z · updated_at: 2018-05-31T15:40:04Z · closed_at: 2018-05-31T15:40:04Z · author_association: NONE

If func changes the size of the core dimension, apply_ufunc(func, ...) only changes the size of the data variable's dimension but keeps the corresponding coordinate's size and values unchanged. The resulting DataArray cannot be saved to a NetCDF file due to the dimension inconsistency. Please see this GitHub gist for a simple illustration.

Would it be more natural to drop the core coordinate by default? This is safer (it will not produce an illegal NetCDF file) and also makes more physical sense (the core coordinate is likely to change).
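For reference, a sketch of how a size-changing core dimension can be handled in more recent xarray versions by listing it in exclude_dims, which effectively drops the stale coordinate (the coarsen helper below is made up):

    import numpy as np
    import xarray as xr

    dr = xr.DataArray(np.arange(6.0), dims=['lat'],
                      coords={'lat': np.linspace(-75, 75, 6)})

    def coarsen(arr):
        # Halve the last (core) dimension by pairwise averaging
        return 0.5 * (arr[..., ::2] + arr[..., 1::2])

    out = xr.apply_ufunc(
        coarsen, dr,
        input_core_dims=[['lat']],
        output_core_dims=[['lat']],
        exclude_dims={'lat'},   # the core dim changes size, so exclude it
    )
    # out has a 3-element lat dimension and no stale 6-element lat coordinate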

I'd like to use apply_ufunc to track metadata in my xESMF package, but the output DataArray ends up with incorrect lon and lat coordinate dimensions and values. It is easy to correct this manually afterwards, but I wanted to bring up the issue.

Output of xr.show_versions():

    commit: None
    python: 3.6.2.final.0
    python-bits: 64
    OS: Darwin
    OS-release: 16.7.0
    machine: x86_64
    processor: i386
    byteorder: little
    LC_ALL: None
    LANG: en_US.UTF-8
    LOCALE: en_US.UTF-8
    xarray: 0.10.0
    pandas: 0.22.0
    numpy: 1.13.3
    scipy: 1.0.0
    netCDF4: 1.3.1
    h5netcdf: 0.5.0
    Nio: None
    bottleneck: 1.2.1
    cyordereddict: None
    dask: 0.17.0
    matplotlib: 2.0.2
    cartopy: 0.15.1
    seaborn: 0.8.0
    setuptools: 38.2.5
    pip: 9.0.1
    conda: None
    pytest: 3.2.5
    IPython: 6.1.0
    sphinx: 1.6.5

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1931/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);