html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/pull/4746#issuecomment-766462310,https://api.github.com/repos/pydata/xarray/issues/4746,766462310,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjMxMA==,5635139,2021-01-24T23:48:00Z,2021-01-24T23:48:00Z,MEMBER,Let me know any post-merge feedback and I'll make the changes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-763903671,https://api.github.com/repos/pydata/xarray/issues/4746,763903671,MDEyOklzc3VlQ29tbWVudDc2MzkwMzY3MQ==,5635139,2021-01-20T20:12:36Z,2021-01-20T20:12:36Z,MEMBER,Any final feedback before merging?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-760532922,https://api.github.com/repos/pydata/xarray/issues/4746,760532922,MDEyOklzc3VlQ29tbWVudDc2MDUzMjkyMg==,5635139,2021-01-14T23:07:16Z,2021-01-14T23:07:16Z,MEMBER,Would anyone know whether the docs failure is related to this PR? I can't see anything in the log apart from matplotlib warnings? https://readthedocs.org/projects/xray/builds/12768566/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-760532153,https://api.github.com/repos/pydata/xarray/issues/4746,760532153,MDEyOklzc3VlQ29tbWVudDc2MDUzMjE1Mw==,5635139,2021-01-14T23:05:16Z,2021-01-14T23:05:16Z,MEMBER,"I double-checked the benchmarks and added a pandas comparison. That involved ensuring the missing value was handled correctly in them and ensuring the setup wasn't in the benchmark. I don't get the 100x speed-up that I thought I saw initially; it's now more like 8x. Still decent! I'm not sure whether that's because I misread the benchmark previously or because the benchmarks are slightly different — I guess the first. Pasting the results below so we have something concrete. Existing ``` asv profile unstacking.Unstacking.time_unstack_slow master | head -n 20 ··· unstacking.Unstacking.time_unstack_slow 861±20ms ``` Proposed ``` asv profile unstacking.Unstacking.time_unstack_slow HEAD | head -n 20 ··· unstacking.Unstacking.time_unstack_slow 108±3ms ``` Pandas ``` asv profile unstacking.Unstacking.time_unstack_pandas_slow master | head -n 20 ··· unstacking.Unstacking.time_unstack_pandas_slow 207±10ms ``` Are we OK with the claim vs pandas? I think it's important that we make accurate comparisons (both good and bad) but I'm open-minded if it seems a bit aggressive. 
Worth someone reviewing the code in the benchmark to ensure I haven't made a mistake.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-758143460,https://api.github.com/repos/pydata/xarray/issues/4746,758143460,MDEyOklzc3VlQ29tbWVudDc1ODE0MzQ2MA==,5635139,2021-01-11T18:37:48Z,2021-01-11T18:37:48Z,MEMBER,Any final comments before merging?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-755420263,https://api.github.com/repos/pydata/xarray/issues/4746,755420263,MDEyOklzc3VlQ29tbWVudDc1NTQyMDI2Mw==,5635139,2021-01-06T16:48:30Z,2021-01-06T16:48:30Z,MEMBER,"As discussed in the dev meeting, https://github.com/dask/dask/pull/7033 would allow dask to use the fast path, and would likely eventually allow our existing path to be removed","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-754401859,https://api.github.com/repos/pydata/xarray/issues/4746,754401859,MDEyOklzc3VlQ29tbWVudDc1NDQwMTg1OQ==,5635139,2021-01-05T05:14:27Z,2021-01-05T05:14:27Z,MEMBER,"Tests now pass after merging master; I'm not sure whether the previous tests were flaky or something upstream changed... Ready for a final review","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753778517,https://api.github.com/repos/pydata/xarray/issues/4746,753778517,MDEyOklzc3VlQ29tbWVudDc1Mzc3ODUxNw==,5635139,2021-01-04T06:11:17Z,2021-01-04T06:11:17Z,MEMBER,"This still seems to be getting a bunch of pint failures, like this one: https://dev.azure.com/xarray/xarray/_build/results?buildId=4657&view=ms.vss-test-web.build-test-results-tab&runId=73796&resultId=110407&paneView=debug I'm confused, since this PR now has no mention of `pint` and I don't see any mention of unstacking in those test failures. I suspect I'm missing something. Any ideas for what could be happening?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753683611,https://api.github.com/repos/pydata/xarray/issues/4746,753683611,MDEyOklzc3VlQ29tbWVudDc1MzY4MzYxMQ==,5635139,2021-01-03T22:14:17Z,2021-01-03T22:14:17Z,MEMBER,"Until https://github.com/pydata/xarray/pull/4751 is resolved, I've taken out the explicit pint check and replaced it with a numpy check. The code is a bit messy now — it now has two levels of comments. But I've put references in, so it should be tractable. Lmk any feedback. Otherwise this is ready to go from my end.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753430518,https://api.github.com/repos/pydata/xarray/issues/4746,753430518,MDEyOklzc3VlQ29tbWVudDc1MzQzMDUxOA==,5635139,2021-01-02T04:37:26Z,2021-01-02T04:37:26Z,MEMBER,"I'm not sure whether I'm making some very basic mistake, but I'm seeing what seems like a very surprising error. 
After the most recent commit, which seems to do very little apart from import `pint` iff it's available: https://github.com/pydata/xarray/pull/4746/commits/b33adedbfbd92df0f4188568691c7e2915bf8c19, I'm getting a lot of pint errors, unrelated to `unstack` / `stack`. Here are the results from the prior run: https://dev.azure.com/xarray/xarray/_build/results?buildId=4650&view=ms.vss-test-web.build-test-results-tab And from this run: https://dev.azure.com/xarray/xarray/_build/results?buildId=4651&view=ms.vss-test-web.build-test-results-tab Any ideas what's happening? As ever, 30% chance that I made an obvious typo that I can't see... Thanks in advance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753427050,https://api.github.com/repos/pydata/xarray/issues/4746,753427050,MDEyOklzc3VlQ29tbWVudDc1MzQyNzA1MA==,5635139,2021-01-02T03:56:19Z,2021-01-02T03:56:19Z,MEMBER,Thank you @jthielen!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753425993,https://api.github.com/repos/pydata/xarray/issues/4746,753425993,MDEyOklzc3VlQ29tbWVudDc1MzQyNTk5Mw==,5635139,2021-01-02T03:43:06Z,2021-01-02T03:43:06Z,MEMBER,"I imagine I'm making some basic error here, but what's the best approach for evaluating whether an array is a pint array? `isinstance(self.data, unit_registry.Quantity)` returns False, though that seems to be what we do in `test_units.py`? ``` (Pdb) unit_registry (Pdb) unit_registry.Quantity .Quantity'> (Pdb) isinstance(self.data, unit_registry.Quantity) False (Pdb) self.data (Pdb) self.data.__class__ .Quantity'> ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753425173,https://api.github.com/repos/pydata/xarray/issues/4746,753425173,MDEyOklzc3VlQ29tbWVudDc1MzQyNTE3Mw==,5635139,2021-01-02T03:30:35Z,2021-01-02T03:30:35Z,MEMBER,"Thanks for responding, @jthielen. Yes, so `full_like` isn't creating a pint array: ``` (Pdb) self.data (Pdb) np.full_like(self.data, fill_value=fill_value) array([nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]) (Pdb) fill_value nan ``` I do think this is a bit surprising — while `fill_value` isn't typed, it's compatible with the existing type. For the moment, I'll direct pint arrays to take the existing code path — I'm more confident that we don't want to special-case pint in the `unstack` routine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753406775,https://api.github.com/repos/pydata/xarray/issues/4746,753406775,MDEyOklzc3VlQ29tbWVudDc1MzQwNjc3NQ==,5635139,2021-01-02T00:03:00Z,2021-01-02T00:03:00Z,MEMBER,"Would anyone be familiar with this pint error? 
https://dev.azure.com/xarray/xarray/_build/results?buildId=4650&view=ms.vss-test-web.build-test-results-tab&runId=73654&resultId=111454&paneView=debug It seems to be failing on the assignment: `data[(..., *indexer)] = reordered`, rather than anything specific to unstacking. Here's the stack trace from there: ```python /home/vsts/work/1/s/xarray/core/variable.py:1627: in _unstack_once data[(..., *indexer)] = reordered /home/vsts/work/1/s/xarray/core/common.py:131: in __array__ return np.asarray(self.values, dtype=dtype) /home/vsts/work/1/s/xarray/core/variable.py:543: in values return _as_array_or_item(self._data) /home/vsts/work/1/s/xarray/core/variable.py:275: in _as_array_or_item data = np.asarray(data) /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/numpy/core/_asarray.py:83: in asarray return array(a, dtype, copy=False, order=order) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t = None def __array__(self, t=None): > warnings.warn( ""The unit of the quantity is stripped when downcasting to ndarray."", UnitStrippedWarning, stacklevel=2, ) E pint.errors.UnitStrippedWarning: The unit of the quantity is stripped when downcasting to ndarray. /usr/share/miniconda/envs/xarray-tests/lib/python3.8/site-packages/pint/quantity.py:1683: UnitStrippedWarning ``` Worst case, I can direct pint arrays to the existing unstack path, but ideally this would work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753391636,https://api.github.com/repos/pydata/xarray/issues/4746,753391636,MDEyOklzc3VlQ29tbWVudDc1MzM5MTYzNg==,5635139,2021-01-01T22:03:19Z,2021-01-01T22:03:19Z,MEMBER,"Great, thanks, I'm making that change. Is there any need to keep the `sparse` kwarg? My inclination is to remove it and retain types — so to get a sparse array back, convert to sparse before unstacking?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550 https://github.com/pydata/xarray/pull/4746#issuecomment-753225789,https://api.github.com/repos/pydata/xarray/issues/4746,753225789,MDEyOklzc3VlQ29tbWVudDc1MzIyNTc4OQ==,5635139,2020-12-31T23:38:33Z,2020-12-31T23:38:33Z,MEMBER,"Any ideas on how sparse arrays should be handled in `unstack`? Currently we use reindex, so this seems to pass through without much effort on our part. In the new code, we're creating an array with `np.full` and then assigning to the appropriate locations. Can we do something similar that's not dependent on the underlying numpy / sparse / dask storage?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,777153550