html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/5734#issuecomment-1126851894,https://api.github.com/repos/pydata/xarray/issues/5734,1126851894,IC_kwDOAMm_X85DKmU2,5635139,2022-05-15T03:30:12Z,2022-05-15T03:33:49Z,MEMBER,Congratulations @dcherian & @Illviljan & [edit] @andersy005 !!! ,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1126852038,https://api.github.com/repos/pydata/xarray/issues/5734,1126852038,IC_kwDOAMm_X85DKmXG,2448579,2022-05-15T03:31:50Z,2022-05-15T03:31:50Z,MEMBER,and @andersy005 !,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1126255398,https://api.github.com/repos/pydata/xarray/issues/5734,1126255398,IC_kwDOAMm_X85DIUsm,1217238,2022-05-13T16:51:24Z,2022-05-13T16:51:24Z,MEMBER,👍 this looks great to me!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1125236627,https://api.github.com/repos/pydata/xarray/issues/5734,1125236627,IC_kwDOAMm_X85DEb-T,2448579,2022-05-12T17:14:13Z,2022-05-12T17:14:13Z,MEMBER,"> a global/context option that changes the default value of method
Unfortunately, the optimal method depends on the distribution of group labels across chunks, so a global option doesn't make sense. It would make sense to add a `method=""auto""` and use that as the default, but it doesn't exist yet (""cohorts"" is closest).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1124194834,https://api.github.com/repos/pydata/xarray/issues/5734,1124194834,IC_kwDOAMm_X85DAdoS,2448579,2022-05-11T19:14:24Z,2022-05-11T19:15:44Z,MEMBER,"Thanks for testing it out! I was going to ping xclim when this finally got merged. Presumably you haven't found any bugs?
---
You can pass `method` as `.mean(..., method=...)`. Clearly this needs docs :)
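For example, a hedged sketch (assuming a dask-backed DataArray `da` with a `time` dimension and flox installed; the kwarg is forwarded to flox):
```python
# hypothetical usage of the `method` kwarg on a groupby reduction;
# 'cohorts' and 'blockwise' are the strategies discussed below
monthly_means = da.groupby('time.month').mean(method='cohorts')
```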
We could actually consider adding `flox_kwargs` to the groupby constructor, since the choice of method really only depends on the distribution of group labels across the chunks. Right now, I'd just like this to get merged :)
For resampling-type operations, we use ""cohorts"" by default, which generalizes to ""blockwise"" when applicable but is slower at graph-construction time. Note that you can only use ""blockwise"" if all members of a group are in a single block. So if you are resampling to yearly but a year of data occupies multiple chunks, you want ""cohorts"", not ""blockwise"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1122662880,https://api.github.com/repos/pydata/xarray/issues/5734,1122662880,IC_kwDOAMm_X85C6nng,14371165,2022-05-10T17:15:46Z,2022-05-10T17:15:46Z,MEMBER,"Yay, mypy is passing now after flox updates. :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1119072230,https://api.github.com/repos/pydata/xarray/issues/5734,1119072230,IC_kwDOAMm_X85Cs6_m,14371165,2022-05-05T21:40:36Z,2022-05-05T21:40:36Z,MEMBER,"Copy/pasting the Resample class removes the rest of the errors. Not especially elegant though; maybe there's a better way?
```
xarray/core/dask_array_compat.py:8: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/backends/locks.py:10: error: Cannot assign to a type [misc]
xarray/backends/locks.py:10: error: Incompatible types in assignment (expression has type ""Type[Lock]"", variable has type ""Type[SerializableLock]"") [assignment]
xarray/backends/locks.py:15: error: Cannot assign to a type [misc]
xarray/backends/locks.py:15: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Lock]"") [assignment]
xarray/core/types.py:19: error: Cannot assign to a type [misc]
xarray/core/types.py:19: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/core/duck_array_ops.py:33: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/nanops.py:14: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/_reductions.py:15: error: Skipping analyzing ""flox"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/_reductions.py:15: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
xarray/core/groupby.py:584: error: Skipping analyzing ""flox.xarray"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/dataset.py:113: error: Cannot assign to a type [misc]
xarray/core/dataset.py:113: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/dataset.py:1761: error: Incompatible return value type (got ""Union[Tuple[ArrayWriter, AbstractDataStore], bytes, Delayed, None]"", expected ""Union[bytes, Delayed, None]"") [return-value]
xarray/core/dataarray.py:70: error: Cannot assign to a type [misc]
xarray/core/dataarray.py:70: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/computation.py:1864: error: Overloaded function implementation cannot satisfy signature 2 due to inconsistencies in how they use type variables [misc]
xarray/core/computation.py:1864: error: Overloaded function implementation cannot satisfy signature 3 due to inconsistencies in how they use type variables [misc]
xarray/core/_typed_ops.pyi:24: error: Cannot assign to a type [misc]
xarray/core/_typed_ops.pyi:24: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/backends/api.py:38: error: Cannot assign to a type [misc]
xarray/backends/api.py:38: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/tests/test_computation.py:2019: error: No overload variant of ""polyval"" matches argument types ""Any"", ""Any"" [call-overload]
xarray/tests/test_computation.py:2019: note: Possible overload variants:
xarray/tests/test_computation.py:2019: note: def polyval(coord: DataArray, coeffs: DataArray, degree_dim: Hashable) -> DataArray
xarray/tests/test_computation.py:2019: note: def [T_Xarray in (DataArray, Dataset)] polyval(coord: T_Xarray, coeffs: Dataset, degree_dim: Hashable) -> Dataset
xarray/tests/test_computation.py:2019: note: def [T_Xarray in (DataArray, Dataset)] polyval(coord: Dataset, coeffs: T_Xarray, degree_dim: Hashable) -> Dataset
xarray/tests/test_testing.py:13: error: Cannot infer type of lambda [misc]
xarray/tests/test_testing.py:13: error: Incompatible types in assignment (expression has type ""Callable[[Any], Any]"", variable has type ""Callable[[Any, Any, Any, Any, Any, Any, Any, Any, Any], Any]"") [assignment]
Found 25 errors in 14 files (checked 140 source files)
Installing missing stub packages:
/usr/share/miniconda/envs/xarray-tests/bin/python -m pip install types-PyYAML types-paramiko types-python-dateutil types-pytz types-setuptools
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1119043959,https://api.github.com/repos/pydata/xarray/issues/5734,1119043959,IC_kwDOAMm_X85Cs0F3,14371165,2022-05-05T21:03:37Z,2022-05-05T21:03:37Z,MEMBER,"OK, this kind of works, but there are new errors because I defined the Data*ResampleBase classes with a for-loop.
```
Found 32 errors in 15 files (checked 140 source files)
xarray/backends/locks.py:10: error: Incompatible types in assignment (expression has type ""Type[Lock]"", variable has type ""Type[SerializableLock]"") [assignment]
xarray/backends/locks.py:15: error: Cannot assign to a type [misc]
xarray/backends/locks.py:15: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Lock]"") [assignment]
xarray/core/types.py:19: error: Cannot assign to a type [misc]
xarray/core/types.py:19: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/core/duck_array_ops.py:33: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/nanops.py:14: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/_reductions.py:15: error: Skipping analyzing ""flox"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/_reductions.py:15: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
xarray/core/groupby.py:584: error: Skipping analyzing ""flox.xarray"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/dataset.py:113: error: Cannot assign to a type [misc]
xarray/core/dataset.py:113: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/dataset.py:1761: error: Incompatible return value type (got ""Union[Tuple[ArrayWriter, AbstractDataStore], bytes, Delayed, None]"", expected ""Union[bytes, Delayed, None]"") [return-value]
xarray/core/dataarray.py:70: error: Cannot assign to a type [misc]
xarray/core/dataarray.py:70: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/computation.py:1864: error: Overloaded function implementation cannot satisfy signature 2 due to inconsistencies in how they use type variables [misc]
xarray/core/computation.py:1864: error: Overloaded function implementation cannot satisfy signature 3 due to inconsistencies in how they use type variables [misc]
xarray/core/_typed_ops.pyi:24: error: Cannot assign to a type [misc]
xarray/core/_typed_ops.pyi:24: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/core/resample.py:14: error: Variable ""xarray.core.resample.GroupByBase"" is not valid as a type [valid-type]
xarray/core/resample.py:14: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
xarray/core/resample.py:14: error: Invalid base class ""GroupByBase"" [misc]
xarray/core/resample.py:189: error: No overload variant of ""__setitem__"" of ""list"" matches argument types ""int"", ""Type[_Resample]"" [call-overload]
xarray/core/resample.py:189: note: Possible overload variants:
xarray/core/resample.py:189: note: def __setitem__(self, SupportsIndex, None) -> None
xarray/core/resample.py:189: note: def __setitem__(self, slice, Iterable[None]) -> None
xarray/core/resample.py:194: error: Variable ""xarray.core.resample.DataArrayResampleBase"" is not valid as a type [valid-type]
xarray/core/resample.py:194: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
xarray/core/resample.py:194: error: Invalid base class ""DataArrayResampleBase"" [misc]
xarray/core/resample.py:285: error: Variable ""xarray.core.resample.DatasetResampleBase"" is not valid as a type [valid-type]
xarray/core/resample.py:285: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
xarray/core/resample.py:285: error: Invalid base class ""DatasetResampleBase"" [misc]
xarray/backends/api.py:38: error: Cannot assign to a type [misc]
xarray/backends/api.py:38: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/tests/test_computation.py:2019: error: No overload variant of ""polyval"" matches argument types ""Any"", ""Any"" [call-overload]
xarray/tests/test_computation.py:2019: note: Possible overload variants:
xarray/tests/test_computation.py:2019: note: def polyval(coord: DataArray, coeffs: DataArray, degree_dim: Hashable) -> DataArray
xarray/tests/test_computation.py:2019: note: def [T_Xarray in (DataArray, Dataset)] polyval(coord: T_Xarray, coeffs: Dataset, degree_dim: Hashable) -> Dataset
xarray/tests/test_computation.py:2019: note: def [T_Xarray in (DataArray, Dataset)] polyval(coord: Dataset, coeffs: T_Xarray, degree_dim: Hashable) -> Dataset
xarray/tests/test_testing.py:13: error: Cannot infer type of lambda [misc]
xarray/tests/test_testing.py:13: error: Incompatible types in assignment (expression has type ""Callable[[Any], Any]"", variable has type ""Callable[[Any, Any, Any, Any, Any, Any, Any, Any, Any], Any]"") [assignment]
```
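For reference, a minimal sketch of the pattern mypy is objecting to (the names are illustrative, not the actual xarray code):
```python
# classes created dynamically in a loop are typed as variables rather than types,
# which triggers the 'Variable ... is not valid as a type' / 'Invalid base class' errors above
bases = []
for name in ('DataArrayResampleBase', 'DatasetResampleBase'):
    bases.append(type(name, (object,), {}))

DataArrayResampleBase, DatasetResampleBase = bases

class DataArrayResample(DataArrayResampleBase):  # mypy: Invalid base class
    ...
```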
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1117613319,https://api.github.com/repos/pydata/xarray/issues/5734,1117613319,IC_kwDOAMm_X85CnW0H,2448579,2022-05-04T17:26:08Z,2022-05-04T17:26:08Z,MEMBER,Thanks @Illviljan I'm having trouble getting the inheritance order right and keeping mypy happy. Help is very welcome!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1117586922,https://api.github.com/repos/pydata/xarray/issues/5734,1117586922,IC_kwDOAMm_X85CnQXq,14371165,2022-05-04T16:57:40Z,2022-05-04T16:57:40Z,MEMBER,"There are a few mypy errors now. Maybe just removing `_flox_reduce` from `_reductions.py` is enough?
```
Successfully installed types-PyYAML-6.0.7 types-cryptography-3.3.21 types-paramiko-2.10.0 types-python-dateutil-2.8.14 types-pytz-2021.3.7 types-setuptools-57.4.14
xarray/backends/locks.py:10: error: Cannot assign to a type [misc]
xarray/backends/locks.py:10: error: Incompatible types in assignment (expression has type ""Type[Lock]"", variable has type ""Type[SerializableLock]"") [assignment]
xarray/backends/locks.py:15: error: Cannot assign to a type [misc]
xarray/backends/locks.py:15: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Lock]"") [assignment]
xarray/core/types.py:19: error: Cannot assign to a type [misc]
xarray/core/types.py:19: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/core/dask_array_compat.py:11: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/duck_array_ops.py:33: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/nanops.py:14: error: Incompatible types in assignment (expression has type ""None"", variable has type Module) [assignment]
xarray/core/_reductions.py:15: error: Skipping analyzing ""flox"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/_reductions.py:15: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
xarray/core/_reductions.py:3194: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3203: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3278: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3287: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3362: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3371: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3462: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3472: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3564: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3574: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3670: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3680: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3793: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3804: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:3918: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:3929: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:4040: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:4051: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:4162: error: ""DatasetResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:4173: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:4269: error: ""DatasetResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5418: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5426: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5495: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5503: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5572: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5580: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5663: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5672: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5756: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5765: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5853: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5862: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:5965: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:5975: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:6079: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:6089: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:6190: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:6200: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:6301: error: ""DataArrayResampleReductions"" has no attribute ""_flox_reduce"" [attr-defined]
xarray/core/_reductions.py:6311: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/_reductions.py:6399: error: ""DataArrayResampleReductions"" has no attribute ""reduce"" [attr-defined]
xarray/core/groupby.py:584: error: Skipping analyzing ""flox.xarray"": module is installed, but missing library stubs or py.typed marker [import]
xarray/core/dataset.py:113: error: Cannot assign to a type [misc]
xarray/core/dataset.py:113: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/dataset.py:1761: error: Incompatible return value type (got ""Union[Tuple[ArrayWriter, AbstractDataStore], bytes, Delayed, None]"", expected ""Union[bytes, Delayed, None]"") [return-value]
xarray/core/dataarray.py:70: error: Cannot assign to a type [misc]
xarray/core/dataarray.py:70: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/core/_typed_ops.pyi:24: error: Cannot assign to a type [misc]
xarray/core/_typed_ops.pyi:24: error: Incompatible types in assignment (expression has type ""Type[ndarray[Any, Any]]"", variable has type ""Type[Array]"") [assignment]
xarray/backends/api.py:38: error: Cannot assign to a type [misc]
xarray/backends/api.py:38: error: Incompatible types in assignment (expression has type ""None"", variable has type ""Type[Delayed]"") [assignment]
xarray/tests/test_testing.py:13: error: Cannot infer type of lambda [misc]
xarray/tests/test_testing.py:13: error: Incompatible types in assignment (expression has type ""Callable[[Any], Any]"", variable has type ""Callable[[Any, Any, Any, Any, Any, Any, Any, Any, Any], Any]"") [assignment]
Installing missing stub packages:
/usr/share/miniconda/envs/xarray-tests/bin/python -m pip install types-PyYAML types-paramiko types-python-dateutil types-pytz types-setuptools
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1117497457,https://api.github.com/repos/pydata/xarray/issues/5734,1117497457,IC_kwDOAMm_X85Cm6hx,2448579,2022-05-04T15:33:30Z,2022-05-04T15:34:06Z,MEMBER,"@pydata/xarray This is ready to go. It's mostly one adaptor function and a lot of new tests. It does need docs, I can add that in a future PR.
By default, we use a strategy (""split-reduce"") that is very similar to our current one with dask arrays, so users will have to [explicitly choose a new strategy](https://flox.readthedocs.io/en/latest/implementation.html) to see much improvement. For resampling, we can choose a sensible default (""cohorts"") that should show only improvements and no regressions.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1092097037,https://api.github.com/repos/pydata/xarray/issues/5734,1092097037,IC_kwDOAMm_X85BGBQN,2448579,2022-04-07T19:00:28Z,2022-04-07T19:00:28Z,MEMBER,@pydata/xarray this is blocked by https://github.com/pydata/xarray/issues/6430 but is ready for review.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-1012599465,https://api.github.com/repos/pydata/xarray/issues/5734,1012599465,IC_kwDOAMm_X848Wwqp,14371165,2022-01-13T23:06:43Z,2022-01-13T23:10:12Z,MEMBER,"This is a strict improvement now, well done!
```
before after ratio
[4c865d60] [9b13c974]
- 18.7±0.3ms 11.0±0.1ms 0.59 groupby.GroupByDask.time_agg_small_num_groups('sum', 2)
- 24.3±1ms 13.8±0.2ms 0.57 groupby.ResampleDask.time_agg_small_num_groups('mean', 2)
- 22.0±0.2ms 12.0±0.3ms 0.54 groupby.ResampleDask.time_agg_small_num_groups('mean', 1)
- 13.5±0.4ms 7.28±0.3ms 0.54 groupby.GroupByDask.time_agg_small_num_groups('sum', 1)
- 33.7±0.4ms 14.2±0.3ms 0.42 groupby.ResampleDask.time_agg_small_num_groups('sum', 2)
- 30.7±0.4ms 11.9±0.2ms 0.39 groupby.ResampleDask.time_agg_small_num_groups('sum', 1)
- 104±1ms 8.21±0.08ms 0.08 groupby.Resample.time_agg_large_num_groups('sum', 1)
- 119±1ms 8.84±0.2ms 0.07 groupby.Resample.time_agg_large_num_groups('sum', 2)
- 115±2ms 8.45±0.2ms 0.07 groupby.Resample.time_agg_large_num_groups('mean', 1)
- 134±1ms 9.60±0.2ms 0.07 groupby.Resample.time_agg_large_num_groups('mean', 2)
- 257±3ms 10.2±0.3ms 0.04 groupby.GroupBy.time_agg_large_num_groups('mean', 2)
- 260±5ms 9.99±0.2ms 0.04 groupby.GroupBy.time_agg_large_num_groups('sum', 2)
- 190±6ms 6.47±0.2ms 0.03 groupby.GroupBy.time_agg_large_num_groups('mean', 1)
- 193±2ms 6.31±0.2ms 0.03 groupby.GroupBy.time_agg_large_num_groups('sum', 1)
- 368±5ms 10.9±0.2ms 0.03 groupby.GroupByDask.time_agg_large_num_groups('mean', 2)
- 494±2ms 13.7±0.3ms 0.03 groupby.ResampleDask.time_agg_large_num_groups('mean', 2)
- 451±8ms 12.3±0.3ms 0.03 groupby.ResampleDask.time_agg_large_num_groups('mean', 1)
- 318±3ms 7.13±0.04ms 0.02 groupby.GroupByDask.time_agg_large_num_groups('mean', 1)
- 531±6ms 10.6±0.2ms 0.02 groupby.GroupByDask.time_agg_large_num_groups('sum', 2)
- 814±10ms 14.0±0.4ms 0.02 groupby.ResampleDask.time_agg_large_num_groups('sum', 2)
- 760±8ms 12.0±0.3ms 0.02 groupby.ResampleDask.time_agg_large_num_groups('sum', 1)
- 490±6ms 7.13±0.2ms 0.01 groupby.GroupByDask.time_agg_large_num_groups('sum', 1)
```
Earlier benchmarks:
```
before after ratio
[5d30f96e] [0ad0dfde]
- 16.7±0.3ms 10.3±0.5ms 0.62 groupby.GroupByDask.time_agg_small_num_groups('sum', 2)
- 21.5±0.8ms 13.1±0.5ms 0.61 groupby.ResampleDask.time_agg_small_num_groups('mean', 1)
- 24.5±1ms 14.4±0.8ms 0.59 groupby.ResampleDask.time_agg_small_num_groups('mean', 2)
- 12.8±0.6ms 7.08±0.3ms 0.55 groupby.GroupByDask.time_agg_small_num_groups('sum', 1)
- 28.9±0.9ms 13.2±0.4ms 0.46 groupby.ResampleDask.time_agg_small_num_groups('sum', 1)
- 31.5±0.5ms 14.2±1ms 0.45 groupby.ResampleDask.time_agg_small_num_groups('sum', 2)
- 110±4ms 10.6±0.4ms 0.10 groupby.Resample.time_agg_large_num_groups('sum', 2)
- 96.2±5ms 8.80±0.4ms 0.09 groupby.Resample.time_agg_large_num_groups('mean', 1)
- 127±3ms 10.9±0.6ms 0.09 groupby.Resample.time_agg_large_num_groups('mean', 2)
- 95.9±2ms 7.44±0.4ms 0.08 groupby.Resample.time_agg_large_num_groups('sum', 1)
- 211±6ms 9.99±0.4ms 0.05 groupby.GroupBy.time_agg_large_num_groups('mean', 2)
- 219±8ms 10.4±1ms 0.05 groupby.GroupBy.time_agg_large_num_groups('sum', 2)
- 154±3ms 6.86±0.4ms 0.04 groupby.GroupBy.time_agg_large_num_groups('mean', 1)
- 163±3ms 6.83±0.2ms 0.04 groupby.GroupBy.time_agg_large_num_groups('sum', 1)
- 330±3ms 10.6±0.6ms 0.03 groupby.GroupByDask.time_agg_large_num_groups('mean', 2)
- 446±10ms 14.2±0.3ms 0.03 groupby.ResampleDask.time_agg_large_num_groups('mean', 2)
- 413±9ms 12.1±0.5ms 0.03 groupby.ResampleDask.time_agg_large_num_groups('mean', 1)
- 265±2ms 7.52±0.2ms 0.03 groupby.GroupByDask.time_agg_large_num_groups('mean', 1)
- 469±10ms 10.3±0.5ms 0.02 groupby.GroupByDask.time_agg_large_num_groups('sum', 2)
- 739±10ms 14.0±0.8ms 0.02 groupby.ResampleDask.time_agg_large_num_groups('sum', 2)
- 678±10ms 12.0±1ms 0.02 groupby.ResampleDask.time_agg_large_num_groups('sum', 1)
- 434±20ms 7.05±0.5ms 0.02 groupby.GroupByDask.time_agg_large_num_groups('sum', 1)
before after ratio
[5d30f96e] [0ad0dfde]
+ 3.89±0.1ms 6.62±0.4ms 1.70 groupby.GroupBy.time_agg_small_num_groups('sum', 1)
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-966624963,https://api.github.com/repos/pydata/xarray/issues/5734,966624963,IC_kwDOAMm_X845nYbD,2448579,2021-11-11T21:05:54Z,2021-11-11T21:05:54Z,MEMBER,This builds on #5950 so that should be reviewed and merged first.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-933160264,https://api.github.com/repos/pydata/xarray/issues/5734,933160264,IC_kwDOAMm_X843nuVI,2448579,2021-10-04T05:42:51Z,2021-11-11T20:58:53Z,MEMBER,"!!!
The only failures are in `test_units.py` so now I think we can figure out how to implement this cleanly.
```
FAILED xarray/tests/test_units.py::TestDataArray::test_computation_objects[float64-method_groupby-data]
FAILED xarray/tests/test_units.py::TestDataArray::test_computation_objects[float64-method_groupby_bins-data]
FAILED xarray/tests/test_units.py::TestDataArray::test_computation_objects[int64-method_groupby-data]
FAILED xarray/tests/test_units.py::TestDataArray::test_computation_objects[int64-method_groupby_bins-data]
FAILED xarray/tests/test_units.py::TestDataArray::test_resample[float64] - pi...
FAILED xarray/tests/test_units.py::TestDataArray::test_resample[int64] - pint...
FAILED xarray/tests/test_units.py::TestDataset::test_computation_objects[float64-data-method_groupby_bins]
FAILED xarray/tests/test_units.py::TestDataset::test_computation_objects[int64-data-method_groupby_bins]
FAILED xarray/tests/test_units.py::TestDataset::test_resample[float64-data]
FAILED xarray/tests/test_units.py::TestDataset::test_resample[int64-data] - p...
```
I like @max-sixty's suggestion of generating the reductions like `generate_ops.py`. It seems like a good first step would be to refactor the existing reductions in a separate PR.
","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-965574480,https://api.github.com/repos/pydata/xarray/issues/5734,965574480,IC_kwDOAMm_X845jX9Q,2448579,2021-11-10T17:31:22Z,2021-11-10T21:52:17Z,MEMBER,OK CI isn't using the numpy_groupies code path for reasons I don't understand. Does anyone see a reason why this might happen?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-964409224,https://api.github.com/repos/pydata/xarray/issues/5734,964409224,IC_kwDOAMm_X845e7eI,2448579,2021-11-09T18:14:38Z,2021-11-09T18:14:38Z,MEMBER,"Benchmarks are looking good (npg=True means numpy_groupies is used). Big gains (10-20x) for a large number of groups (100), especially with dask.
```
[ 2.78%] ··· groupby.GroupBy.time_agg_large_num_groups ok
[ 2.78%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 8.38±0ms 101±0ms 9.54±0ms 136±0ms
mean 7.12±0ms 101±0ms 9.74±0ms 148±0ms
======== ========== =========== ========== ===========
[ 5.56%] ··· groupby.GroupBy.time_agg_small_num_groups ok
[ 5.56%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 8.27±0ms 4.55±0ms 9.07±0ms 8.46±0ms
mean 7.19±0ms 4.50±0ms 9.24±0ms 8.36±0ms
======== ========== =========== ========== ===========
[ 8.33%] ··· groupby.GroupBy.time_init ok
[ 8.33%] ··· ====== ==========
ndim
------ ----------
1 1.72±0ms
2 4.06±0ms
====== ==========
[ 11.11%] ··· groupby.GroupByDask.time_agg_large_num_groups ok
[ 11.11%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 8.41±0ms 202±0ms 9.93±0ms 226±0ms
mean 7.83±0ms 197±0ms 10.7±0ms 213±0ms
======== ========== =========== ========== ===========
[ 13.89%] ··· groupby.GroupByDask.time_agg_small_num_groups ok
[ 13.89%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 8.41±0ms 8.99±0ms 10.5±0ms 12.5±0ms
mean 7.98±0ms 8.67±0ms 10.1±0ms 12.2±0ms
======== ========== =========== ========== ===========
[ 16.67%] ··· groupby.GroupByDask.time_init ok
[ 16.67%] ··· ====== ==========
ndim
------ ----------
1 1.77±0ms
2 4.06±0ms
====== ==========
```
```
[ 36.11%] ··· groupby.Resample.time_agg_large_num_groups ok
[ 36.11%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 17.2±0ms 83.3±0ms 17.0±0ms 93.5±0ms
mean 15.5±0ms 91.0±0ms 17.4±0ms 101±0ms
======== ========== =========== ========== ===========
[ 38.89%] ··· groupby.Resample.time_agg_small_num_groups ok
[ 38.89%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 16.7±0ms 12.3±0ms 16.7±0ms 13.3±0ms
mean 15.2±0ms 12.5±0ms 19.3±0ms 13.9±0ms
======== ========== =========== ========== ===========
[ 41.67%] ··· groupby.Resample.time_init ok
[ 41.67%] ··· ====== ==========
ndim
------ ----------
1 7.46±0ms
2 7.26±0ms
====== ==========
[ 44.44%] ··· groupby.ResampleDask.time_agg_large_num_groups ok
[ 44.44%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 22.3±0ms 561±0ms 28.3±0ms 607±0ms
mean 22.2±0ms 344±0ms 27.3±0ms 371±0ms
======== ========== =========== ========== ===========
[ 47.22%] ··· groupby.ResampleDask.time_agg_small_num_groups ok
[ 47.22%] ··· ======== ========== =========== ========== ===========
-- ndim / npg
-------- ---------------------------------------------
method 1 / True 1 / False 2 / True 2 / False
======== ========== =========== ========== ===========
sum 17.7±0ms 31.2±0ms 20.0±0ms 34.2±0ms
mean 17.2±0ms 24.4±0ms 19.9±0ms 26.6±0ms
======== ========== =========== ========== ===========
[ 50.00%] ··· groupby.ResampleDask.time_init ok
[ 50.00%] ··· ====== ==========
ndim
------ ----------
1 7.43±0ms
2 6.91±0ms
====== ==========
```","{""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 3, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-963568052,https://api.github.com/repos/pydata/xarray/issues/5734,963568052,IC_kwDOAMm_X845buG0,2448579,2021-11-08T21:00:04Z,2021-11-08T21:00:04Z,MEMBER,"> Maybe it's also on the pint side? Even if numpy_groupies supports the like argument it will crash because pint doesn't support asanyarray.
cc @keewis ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-956492667,https://api.github.com/repos/pydata/xarray/issues/5734,956492667,IC_kwDOAMm_X845Aut7,14371165,2021-11-01T18:44:43Z,2021-11-01T18:44:43Z,MEMBER,"Maybe it's also on the `pint` side? Even if `numpy_groupies` supports the `like` argument it will crash because pint doesn't support `asanyarray`.
```python
import numpy as np
import pint
import dask.array as da
# pint crashes:
np.asanyarray([1, 2], like=pint.Quantity(1, ""s""))
Traceback (most recent call last):
  File ""<stdin>"", line 1, in <module>
    np.asanyarray([1, 2], like=pint.Quantity(1, ""s""))
TypeError: no implementation found for 'numpy.asanyarray' on types that implement __array_function__: [<class 'pint.quantity.Quantity'>]
# dask supports it:
np.asanyarray([1, 2], like=da.array(0))
Out[12]: dask.array<...>
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-954290212,https://api.github.com/repos/pydata/xarray/issues/5734,954290212,IC_kwDOAMm_X8444VAk,2448579,2021-10-28T23:11:08Z,2021-10-28T23:11:08Z,MEMBER,"> appears numpy_groupies is forcing the duck arrays to numpy arrays.
yes; this will require upstream changes","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-938906859,https://api.github.com/repos/pydata/xarray/issues/5734,938906859,IC_kwDOAMm_X8439pTr,14371165,2021-10-08T17:20:48Z,2021-10-08T17:20:48Z,MEMBER,"Looking at those unit errors, it appears `numpy_groupies` is forcing the duck arrays to numpy arrays. Perhaps adding a `like` argument to the `np.asanyarray()` call will do the trick?
```python
C:\Miniconda\envs\xarray-tests\lib\site-packages\numpy_groupies\utils_numpy.py:199: in input_validation
a = np.asanyarray(a)
```
https://github.com/ml31415/numpy-groupies/blob/7a31d1f9bbd51111b4a4d01cf1df01a5b4827e85/numpy_groupies/utils_numpy.py#L199
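A rough sketch of what that change could look like (hypothetical, not the actual numpy_groupies code; `like=` needs numpy >= 1.20 / NEP 35):
```python
# hypothetical patch sketch for numpy_groupies' input_validation:
# dispatch asanyarray to the duck array's own implementation via `like=`,
# so dask/pint arrays are not silently converted to numpy arrays
if hasattr(a, '__array_function__'):
    a = np.asanyarray(a, like=a)
else:
    a = np.asanyarray(a)
```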
```python
____________________ TestDataset.test_resample[int32-data] ____________________
[gw1] win32 -- Python 3.9.7 C:\Miniconda\envs\xarray-tests\python.exe
self =
variant = 'data', dtype = dtype('int32')
@pytest.mark.parametrize(
""variant"",
(
""data"",
pytest.param(
""dims"", marks=pytest.mark.skip(reason=""indexes don't support units"")
),
""coords"",
),
)
def test_resample(self, variant, dtype):
# TODO: move this to test_computation_objects
variants = {
""data"": ((unit_registry.degK, unit_registry.Pa), 1, 1),
""dims"": ((1, 1), unit_registry.m, 1),
""coords"": ((1, 1), 1, unit_registry.m),
}
(unit1, unit2), dim_unit, coord_unit = variants.get(variant)
array1 = np.linspace(-5, 5, 10 * 5).reshape(10, 5).astype(dtype) * unit1
array2 = np.linspace(10, 20, 10 * 8).reshape(10, 8).astype(dtype) * unit2
t = pd.date_range(""10-09-2010"", periods=array1.shape[0], freq=""1y"")
y = np.arange(5) * dim_unit
z = np.arange(8) * dim_unit
u = np.linspace(-1, 0, 5) * coord_unit
ds = xr.Dataset(
data_vars={""a"": ((""time"", ""y""), array1), ""b"": ((""time"", ""z""), array2)},
coords={""time"": t, ""y"": y, ""z"": z, ""u"": (""y"", u)},
)
units = extract_units(ds)
func = method(""resample"", time=""6m"")
expected = attach_units(func(strip_units(ds)).mean(), units)
> actual = func(ds).mean()
D:\a\xarray\xarray\xarray\tests\test_units.py:5366:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
D:\a\xarray\xarray\xarray\core\groupby.py:602: in wrapped_func
result = xarray_reduce(
C:\Miniconda\envs\xarray-tests\lib\site-packages\dask_groupby\xarray.py:259: in xarray_reduce
actual = xr.apply_ufunc(
D:\a\xarray\xarray\xarray\core\computation.py:1153: in apply_ufunc
return apply_dataset_vfunc(
D:\a\xarray\xarray\xarray\core\computation.py:447: in apply_dataset_vfunc
result_vars = apply_dict_of_variables_vfunc(
D:\a\xarray\xarray\xarray\core\computation.py:391: in apply_dict_of_variables_vfunc
result_vars[name] = func(*variable_args)
D:\a\xarray\xarray\xarray\core\computation.py:733: in apply_variable_ufunc
result_data = func(*input_data)
C:\Miniconda\envs\xarray-tests\lib\site-packages\dask_groupby\xarray.py:232: in wrapper
result, groups = groupby_reduce(*args, **kwargs)
C:\Miniconda\envs\xarray-tests\lib\site-packages\dask_groupby\core.py:1119: in groupby_reduce
results = chunk_reduce(
C:\Miniconda\envs\xarray-tests\lib\site-packages\dask_groupby\core.py:521: in chunk_reduce
result = _get_aggregate(backend)(
C:\Miniconda\envs\xarray-tests\lib\site-packages\numpy_groupies\aggregate_numpy.py:291: in aggregate
return _aggregate_base(group_idx, a, size=size, fill_value=fill_value,
C:\Miniconda\envs\xarray-tests\lib\site-packages\numpy_groupies\aggregate_numpy.py:256: in _aggregate_base
group_idx, a, flat_size, ndim_idx, size = input_validation(group_idx, a,
C:\Miniconda\envs\xarray-tests\lib\site-packages\numpy_groupies\utils_numpy.py:199: in input_validation
a = np.asanyarray(a)
C:\Miniconda\envs\xarray-tests\lib\site-packages\numpy\core\_asarray.py:171: in asanyarray
return array(a, dtype, copy=False, order=order, subok=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
t = None
def __array__(self, t=None):
> warnings.warn(
""The unit of the quantity is stripped when downcasting to ndarray."",
UnitStrippedWarning,
stacklevel=2,
)
E pint.errors.UnitStrippedWarning: The unit of the quantity is stripped when downcasting to ndarray.
C:\Miniconda\envs\xarray-tests\lib\site-packages\pint\quantity.py:1700: UnitStrippedWarning
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586
https://github.com/pydata/xarray/pull/5734#issuecomment-913070347,https://api.github.com/repos/pydata/xarray/issues/5734,913070347,IC_kwDOAMm_X842bFkL,2448579,2021-09-05T01:52:49Z,2021-09-05T01:52:49Z,MEMBER,We don't have any asv benchmarks for groupby currently. It would be good to add some! ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,978356586