issue_comments: 1268380486
html_url: https://github.com/pydata/xarray/issues/1077#issuecomment-1268380486
issue_url: https://api.github.com/repos/pydata/xarray/issues/1077
id: 1268380486
node_id: IC_kwDOAMm_X85LmfNG
user: 5230109
created_at: 2022-10-05T12:38:05Z
updated_at: 2022-10-05T12:38:05Z
author_association: NONE
reactions: { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
performed_via_github_app:
issue: 187069161

body:

Hi everyone, first of all, thanks for your amazing work! I came across this issue today because I have a dataset with multiple variables and multiple multi-index dimensions, some of which aren't used by every variable. I had to slightly adapt the workaround posted by @dcherian to get things working, so I'll post it here in case someone else finds the patch useful. I'm not sure whether it would be a viable fix for the issue, though; let me know if it is and I'll open a PR.

```python
import numpy as np
import pandas as pd


def encode_multiindex(ds, idxname):
    # Replace the MultiIndex dimension with a plain integer index plus one
    # coordinate per level holding that level's unique values.
    encoded = ds.reset_index(idxname)
    coords = dict(zip(ds.indexes[idxname].names, ds.indexes[idxname].levels))
    for coord in coords:
        encoded[coord] = coords[coord].values
    # Collapse the per-level codes into a single 1-D integer index and record
    # the level names in a CF-style "compress" attribute.
    shape = [encoded.sizes[coord] for coord in coords]
    encoded[idxname] = np.ravel_multi_index(ds.indexes[idxname].codes, shape)
    encoded[idxname].attrs["compress"] = " ".join(ds.indexes[idxname].names)
    return encoded


def decode_to_multiindex(encoded, idxname):
    # Read the level names back from the "compress" attribute and rebuild the
    # MultiIndex from the raveled integer index.
    names = encoded[idxname].attrs["compress"].split(" ")
    shape = [encoded.sizes[dim] for dim in names]
    indices = np.unravel_index(encoded[idxname].values, shape)
    arrays = np.array([encoded[dim].values[index] for dim, index in zip(names, indices)])
    mindex = pd.MultiIndex.from_arrays(arrays, names=names)
```
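
The decode helper as posted stops after rebuilding the `pandas.MultiIndex`, so the snippet below is only a rough sketch of how the pair of helpers could be wired together end to end. Everything in it is illustrative rather than part of the comment: the `finish_decode` helper, the toy `letter`/`number` dataset, and the file name `compressed.nc` are names introduced here, and reattaching the index via `assign_coords` is just one possible way to finish the decode step.

```python
import numpy as np
import pandas as pd
import xarray as xr


def finish_decode(encoded, idxname):
    # Same reconstruction as decode_to_multiindex above, plus the final step
    # of putting the MultiIndex back onto the compressed dimension.
    names = encoded[idxname].attrs["compress"].split(" ")
    shape = [encoded.sizes[dim] for dim in names]
    indices = np.unravel_index(encoded[idxname].values, shape)
    # A plain list (rather than np.array) keeps each level's dtype intact.
    arrays = [encoded[dim].values[index] for dim, index in zip(names, indices)]
    mindex = pd.MultiIndex.from_arrays(arrays, names=names)

    # Drop the integer index and the stored level values, then reattach the
    # MultiIndex. On xarray >= 2023.08, building the coordinate explicitly via
    # xr.Coordinates.from_pandas_multiindex(mindex, idxname) is preferred.
    decoded = encoded.drop_vars([idxname] + names)
    return decoded.assign_coords({idxname: mindex})


# Round trip: MultiIndex -> compressed integer index -> netCDF -> MultiIndex.
midx = pd.MultiIndex.from_product(
    [["a", "b"], [1, 2, 3]], names=("letter", "number")
)
ds = xr.Dataset({"v": ("sample", np.arange(6.0))}, coords={"sample": midx})

encoded = encode_multiindex(ds, "sample")
encoded.to_netcdf("compressed.nc")  # serializes fine: no MultiIndex left

roundtripped = finish_decode(xr.open_dataset("compressed.nc"), "sample")
print(roundtripped.indexes["sample"].equals(ds.indexes["sample"]))  # True
```

Keeping the level arrays as a plain list avoids the dtype coercion that `np.array([...])` applies when the levels mix types (e.g. strings and integers).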