issue_comments


3 rows where issue = 1189140909 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
1089206433 https://github.com/pydata/xarray/issues/6434#issuecomment-1089206433 https://api.github.com/repos/pydata/xarray/issues/6434 IC_kwDOAMm_X85A6_ih benbovy 4160723 2022-04-05T19:07:34Z 2022-04-05T19:07:34Z MEMBER

Alternatively, we could loop over datasets and call expand_dims if dim is present and is a scalar.

Ah yes 👍. Not sure why this case didn't meet the conditions for calling expand_dims on the input datasets.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  concat along dim with mix of scalar coordinate and array coordinates is not right 1189140909
1086751973 https://github.com/pydata/xarray/issues/6434#issuecomment-1086751973 https://api.github.com/repos/pydata/xarray/issues/6434 IC_kwDOAMm_X85AxoTl dcherian 2448579 2022-04-03T01:00:48Z 2022-04-03T02:02:34Z MEMBER

which doesn't seem to create the right kind of index given the value type:

There's a typo in the first line; we need `da.time`. This does actually work:

```python
array = da.isel(time=0).time.values
value = array.item()
seq = np.array([value], dtype=array.dtype)
pd.Index(seq, dtype=array.dtype)

DatetimeIndex(['2013-01-01'], dtype='datetime64[ns]', freq=None)
```

The issue is that `.item()` converts the datetime64 to an int, so `safe_cast_to_index` creates an IntIndex because we don't pass `dtype` to the `pd.Index` constructor (so that's one possible fix): https://github.com/pydata/xarray/blob/3ead17ea9e99283e2511b65b9d864d1c7b10b3c4/xarray/core/utils.py#L130-L133
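A minimal, self-contained reproduction of that conversion (plain numpy/pandas, no xarray; the 0-d array below is a stand-in for `da.isel(time=0).time.values`):

```python
import numpy as np
import pandas as pd

# Stand-in for da.isel(time=0).time.values: a 0-d datetime64[ns] array.
array = np.array("2013-01-01", dtype="datetime64[ns]")

value = array.item()          # returns a plain int (nanoseconds since epoch), not a datetime
print(type(value), value)     # <class 'int'> 1356998400000000000

# Without an explicit dtype, pd.Index infers an integer index from that int:
print(pd.Index([value]))      # an integer index, not a DatetimeIndex

# Passing the original dtype through (the possible fix mentioned above):
print(pd.Index(np.array([value], dtype=array.dtype)))
# DatetimeIndex(['2013-01-01'], dtype='datetime64[ns]', freq=None)
```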

Alternatively, we could loop over datasets and call expand_dims if dim is present and is a scalar. We already do something similar here: https://github.com/pydata/xarray/blob/305533d585389f7240ae2383a323337d4761d33a/xarray/core/concat.py#L470-L472
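A rough sketch of what that alternative could look like, written as a hypothetical pre-processing helper rather than xarray's actual concat internals (`objs` and `dim` are placeholder names):

```python
def promote_scalar_dim(objs, dim):
    """Hypothetical helper: promote a scalar `dim` coordinate to a length-1
    dimension coordinate so that every input carries a proper index along
    `dim` before concatenation."""
    out = []
    for obj in objs:
        if dim in obj.coords and obj.coords[dim].ndim == 0:
            obj = obj.expand_dims(dim)  # scalar coord -> length-1 dimension coordinate
        out.append(obj)
    return out

# Usage sketch: xarray.concat(promote_scalar_dim(list_of_objs, "time"), dim="time")
```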

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  concat along dim with mix of scalar coordinate and array coordinates is not right 1189140909
1086704180 https://github.com/pydata/xarray/issues/6434#issuecomment-1086704180 https://api.github.com/repos/pydata/xarray/issues/6434 IC_kwDOAMm_X85Axco0 benbovy 4160723 2022-04-02T19:08:48Z 2022-04-02T19:08:48Z MEMBER

The first example works because there's no index.

In the second example, a PandasIndex is created from the scalar value (wrapped in a sequence) so that concat works on the "time" dimension (required since the logic has moved to Index.concat). See https://github.com/pydata/xarray/pull/5692#discussion_r820581305.

The problem is when creating a PandasIndex we call pd.Index, which doesn't seem to create the right kind of index given the value type:

```python
array = da.isel(time=0).values
value = array.item()
seq = np.array([value], dtype=array.dtype)
pd.Index(seq, dtype=array.dtype)

Float64Index([1.0], dtype='float64')
```

So in the example above you end up with different index types, which `xr.align` doesn't like:

```python
concat.indexes["time"]

Index([1356998400000000000, 2013-01-01 06:00:00], dtype='object', name='time')

da.indexes["time"]

DatetimeIndex(['2013-01-01 00:00:00', '2013-01-01 06:00:00'], dtype='datetime64[ns]', name='time', freq=None)

concat.indexes["time"].equals(da.indexes["time"])

False

```

I'm not very satisfied with the current solution in concat but I'm not sure what we should do here:

  • Special-case datetime (and other value types)?
  • Review the approach used to concatenate scalar coordinates (no index) and indexed array coordinates?
  • Deprecate concatenating a mix of scalar coordinates and indexed coordinates?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  concat along dim with mix of scalar coordinate and array coordinates is not right 1189140909


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
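For reference, a minimal sketch of querying this table directly with Python's sqlite3 module; the database filename `github.db` is an assumption, and the query mirrors the row filter and ordering shown above:

```python
import sqlite3

# Hypothetical filename; use whatever the exported xarray-datasette database is called.
conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = ?
    ORDER BY updated_at DESC
    """,
    (1189140909,),
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["author_association"])
```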