issue_comments
4 rows where issue = 412078232 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
470249595 https://github.com/pydata/xarray/pull/2778#issuecomment-470249595 https://api.github.com/repos/pydata/xarray/issues/2778 MDEyOklzc3VlQ29tbWVudDQ3MDI0OTU5NQ== fujiisoup 6815844 2019-03-06T19:47:39Z 2019-03-06T19:47:39Z MEMBER

Thanks for the follow up pr. Merging.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add support for cftime.datetime coordinates with coarsen 412078232
470248465 https://github.com/pydata/xarray/pull/2778#issuecomment-470248465 https://api.github.com/repos/pydata/xarray/issues/2778 MDEyOklzc3VlQ29tbWVudDQ3MDI0ODQ2NQ== jbusecke 14314623 2019-03-06T19:44:24Z 2019-03-06T19:44:43Z CONTRIBUTOR

Oh yeah, that seems totally fair to me. Thanks for clarifying. Can't wait to have this functionality! Thanks @spencerkclark

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add support for cftime.datetime coordinates with coarsen 412078232
470240931 https://github.com/pydata/xarray/pull/2778#issuecomment-470240931 https://api.github.com/repos/pydata/xarray/issues/2778 MDEyOklzc3VlQ29tbWVudDQ3MDI0MDkzMQ== spencerkclark 6628425 2019-03-06T19:22:54Z 2019-03-06T19:23:27Z MEMBER

Oh, I should have been a little clearer!

> For now I've held off on making these changes dask-compatible (I could do it, but I'm not sure it is worth the extra complexity)

This comment only applies to the changes regarding `duck_array_ops.mean`, which is used by default on the coordinates involved in coarsen. Since indexes are always loaded into memory, i.e. backed by NumPy arrays, we don't really need to worry about dask-compatibility there. In other words, with this PR a DataArray can hold dask array data indexed by a cftime time coordinate, and coarsen will work just fine:

```
In [1]: import xarray as xr

In [2]: import numpy as np

In [3]: data = np.random.random((10, 5))

In [4]: times = xr.cftime_range('2000', periods=10)

In [5]: da = xr.DataArray(data, coords={'time': times}, dims=['time', 'x'])

In [6]: da = da.chunk({'time': 1, 'x': 1})

In [7]: da
Out[7]:
<xarray.DataArray (time: 10, x: 5)>
dask.array<shape=(10, 5), dtype=float64, chunksize=(1, 1)>
Coordinates:
  * time     (time) object 2000-01-01 00:00:00 ... 2000-01-10 00:00:00
Dimensions without coordinates: x

In [8]: da.coarsen(time=2).mean()
Out[8]:
<xarray.DataArray (time: 5, x: 5)>
dask.array<shape=(5, 5), dtype=float64, chunksize=(1, 1)>
Coordinates:
  * time     (time) object 2000-01-01 12:00:00 ... 2000-01-09 12:00:00
Dimensions without coordinates: x
```

This would only come up as a possible issue if you tried to lazily take the mean of a DataArray of cftime objects, e.g.:

```
In [9]: da = xr.DataArray(times, dims=['t']).chunk()

In [10]: da
Out[10]:
<xarray.DataArray (t: 10)>
dask.array<shape=(10,), dtype=object, chunksize=(10,)>
Coordinates:
  * t        (t) object 2000-01-01 00:00:00 ... 2000-01-10 00:00:00

In [11]: da.mean()
---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
<ipython-input-19-c02402258881> in <module>
----> 1 da.mean()

~/xarray-dev/xarray/xarray/core/common.py in wrapped_func(self, dim, axis, skipna, **kwargs)
     23                          **kwargs):
     24             return self.reduce(func, dim, axis,
---> 25                                skipna=skipna, allow_lazy=True, **kwargs)
     26         else:
     27             def wrapped_func(self, dim=None, axis=None,  # type: ignore

~/xarray-dev/xarray/xarray/core/dataarray.py in reduce(self, func, dim, axis, keep_attrs, **kwargs)
   1603         """
   1604
-> 1605         var = self.variable.reduce(func, dim, axis, keep_attrs, **kwargs)
   1606         return self._replace_maybe_drop_dims(var)
   1607

~/xarray-dev/xarray/xarray/core/variable.py in reduce(self, func, dim, axis, keep_attrs, allow_lazy, **kwargs)
   1366             data = func(input_data, axis=axis, **kwargs)
   1367         else:
-> 1368             data = func(input_data, **kwargs)
   1369
   1370         if getattr(data, 'shape', ()) == self.shape:

~/xarray-dev/xarray/xarray/core/duck_array_ops.py in mean(array, axis, skipna, **kwargs)
    348     if isinstance(array, dask_array_type):
    349         raise NotImplementedError(
--> 350             'Computing the mean of an array containing '
    351             'cftime.datetime objects is not yet implemented on '
    352             'dask arrays.')

NotImplementedError: Computing the mean of an array containing cftime.datetime objects is not yet implemented on dask arrays.
```

but I think that's a pretty rare use case, hence why I've held off on adding that support for now.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add support for cftime.datetime coordinates with coarsen 412078232
470226713 https://github.com/pydata/xarray/pull/2778#issuecomment-470226713 https://api.github.com/repos/pydata/xarray/issues/2778 MDEyOklzc3VlQ29tbWVudDQ3MDIyNjcxMw== jbusecke 14314623 2019-03-06T18:45:16Z 2019-03-06T18:45:16Z CONTRIBUTOR

Oh sweet, I just encountered this problem. Would this work on a large dask array with a non-dask time dimension?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add support for cftime.datetime coordinates with coarsen 412078232
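Each row above stores its `reactions` column as a JSON string like the zero-count blocks shown with every comment. A minimal sketch of parsing such a blob with Python's standard library (the JSON literal is copied from the rows above; the `reaction_summary` helper is hypothetical, not part of any API):

```python
import json

# Reactions blob exactly as stored in the `reactions` TEXT column above.
raw = '''{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}'''

def reaction_summary(blob):
    """Parse a reactions JSON string; return (total, names of non-zero reactions)."""
    data = json.loads(blob)
    total = data.pop("total_count")
    nonzero = sorted(name for name, count in data.items() if count > 0)
    return total, nonzero

print(reaction_summary(raw))  # every comment in this thread has zero reactions: (0, [])
```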

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
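The schema above can be exercised directly with Python's built-in `sqlite3` module. A minimal sketch, assuming an in-memory database; the `REFERENCES` clauses are dropped here because the referenced `users` and `issues` tables are not created, and the inserted values are copied from the first row of the table above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")

# Values copied from the first row of the table above.
conn.execute(
    "INSERT INTO issue_comments (id, user, created_at, author_association, body, issue) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (470249595, 6815844, "2019-03-06T19:47:39Z",
     "MEMBER", "Thanks for the follow up pr. Merging.", 412078232),
)

# Reproduce the page's query: comments on issue 412078232, newest first.
rows = conn.execute(
    "SELECT id, author_association FROM issue_comments "
    "WHERE issue = ? ORDER BY created_at DESC",
    (412078232,),
).fetchall()
print(rows)  # [(470249595, 'MEMBER')]
```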
Powered by Datasette · Queries took 197.799ms · About: xarray-datasette