html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/7139#issuecomment-1273284963,https://api.github.com/repos/pydata/xarray/issues/7139,1273284963,IC_kwDOAMm_X85L5Mlj,21131639,2022-10-10T13:04:30Z,2022-10-10T13:04:30Z,CONTRIBUTOR,"Based on your suggestion above, I tried this single-line fix, which resolved my issue: https://github.com/pydata/xarray/pull/7150

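Since the PR diff isn't quoted here, a minimal stand-alone sketch of the kind of guard such a single-line fix implies (all class and function names below are simplified stand-ins, not xarray's actual internals):

```python
# Simplified stand-ins mirroring the pattern in the traceback below:
# IndexVariable overrides the .data setter to refuse reassignment.
class Variable:
    def __init__(self, data):
        self._data = data

    @property
    def data(self):
        return self._data

    @data.setter
    def data(self, value):
        self._data = value


class IndexVariable(Variable):
    @Variable.data.setter
    def data(self, value):
        raise ValueError('Cannot assign to the .data attribute of an IndexVariable')


def protect_variables_inplace(variables):
    # Assumed shape of the fix: skip IndexVariable instances so their
    # read-only .data setter is never triggered while wrapping the rest.
    for var in variables:
        if isinstance(var, IndexVariable):
            continue
        var.data = list(var.data)  # stand-in for wrapping in a caching array
```

With a guard like this, only non-index variables get their `.data` rewrapped, so the `ValueError` from the setter is never raised.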
However, I'm not sure if this is the correct approach, since I'm not all that deeply familiar with the indexing model.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1400949778
https://github.com/pydata/xarray/issues/7139#issuecomment-1272922855,https://api.github.com/repos/pydata/xarray/issues/7139,1272922855,IC_kwDOAMm_X85L30Ln,21131639,2022-10-10T07:59:01Z,2022-10-10T07:59:01Z,CONTRIBUTOR,"Here is the full stacktrace:

```python
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In [12], line 7
----> 7 loaded = xr.open_dataset(""multiindex.nc"", engine=""netcdf4-multiindex"", handle_multiindex=True)
      8 print(loaded)

File ~/.local/share/virtualenvs/test-oePfdNug/lib/python3.8/site-packages/xarray/backends/api.py:537, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, backend_kwargs, **kwargs)
    530 overwrite_encoded_chunks = kwargs.pop(""overwrite_encoded_chunks"", None)
    531 backend_ds = backend.open_dataset(
    532     filename_or_obj,
    533     drop_variables=drop_variables,
    534     **decoders,
    535     **kwargs,
    536 )
--> 537 ds = _dataset_from_backend_dataset(
    538     backend_ds,
    539     filename_or_obj,
    540     engine,
    541     chunks,
    542     cache,
    543     overwrite_encoded_chunks,
    544     inline_array,
    545     drop_variables=drop_variables,
    546     **decoders,
    547     **kwargs,
    548 )
    549 return ds

File ~/.local/share/virtualenvs/test-oePfdNug/lib/python3.8/site-packages/xarray/backends/api.py:345, in _dataset_from_backend_dataset(backend_ds, filename_or_obj, engine, chunks, cache, overwrite_encoded_chunks, inline_array, **extra_tokens)
    340 if not isinstance(chunks, (int, dict)) and chunks not in {None, ""auto""}:
    341     raise ValueError(
    342         f""chunks must be an int, dict, 'auto', or None. Instead found {chunks}.""
    343     )
--> 345 _protect_dataset_variables_inplace(backend_ds, cache)
    346 if chunks is None:
    347     ds = backend_ds

File ~/.local/share/virtualenvs/test-oePfdNug/lib/python3.8/site-packages/xarray/backends/api.py:239, in _protect_dataset_variables_inplace(dataset, cache)
    237 if cache:
    238     data = indexing.MemoryCachedArray(data)
--> 239 variable.data = data

File ~/.local/share/virtualenvs/test-oePfdNug/lib/python3.8/site-packages/xarray/core/variable.py:2795, in IndexVariable.data(self, data)
   2793 @Variable.data.setter  # type: ignore[attr-defined]
   2794 def data(self, data):
-> 2795     raise ValueError(
   2796         f""Cannot assign to the .data attribute of dimension coordinate a.k.a IndexVariable {self.name!r}. ""
   2797         f""Please use DataArray.assign_coords, Dataset.assign_coords or Dataset.assign as appropriate.""
   2798     )

ValueError: Cannot assign to the .data attribute of dimension coordinate a.k.a IndexVariable 'measurement'. Please use DataArray.assign_coords, Dataset.assign_coords or Dataset.assign as appropriate.
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1400949778