issues: 2059035193


  • id: 2059035193
  • node_id: I_kwDOAMm_X856umI5
  • number: 8574
  • title: NetCDF writing error
  • user: 16617720
  • state: closed
  • locked: 0
  • comments: 5
  • created_at: 2023-12-29T03:43:10Z
  • updated_at: 2023-12-29T04:26:39Z
  • closed_at: 2023-12-29T04:07:56Z
  • author_association: NONE

What happened?

While writing a dataset with `to_netcdf`, I hit the following error, which prevents any netCDF writing. It doesn't seem to matter which dataset is used, whether `compute` is on or off, or which engine is chosen, which suggests a bug.

Reproducible example:

```python
import xarray as xr

ds = xr.tutorial.load_dataset("air_temperature")
ds.to_netcdf('testing.nc')
```

Error Response

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[7], line 2
      1 ds = xr.tutorial.load_dataset("air_temperature")
----> 2 ds.to_netcdf('testing.nc')

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/xarray/core/dataset.py:2310, in Dataset.to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf)
   2307     encoding = {}
   2308 from xarray.backends.api import to_netcdf
-> 2310 return to_netcdf(  # type: ignore  # mypy cannot resolve the overloads:(
   2311     self,
   2312     path,
   2313     mode=mode,
   2314     format=format,
   2315     group=group,
   2316     engine=engine,
   2317     encoding=encoding,
   2318     unlimited_dims=unlimited_dims,
   2319     compute=compute,
   2320     multifile=False,
   2321     invalid_netcdf=invalid_netcdf,
   2322 )

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/xarray/backends/api.py:1279, in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf)
   1276     format = format.upper()  # type: ignore[assignment]
   1278 # handle scheduler specific logic
-> 1279 scheduler = _get_scheduler()
   1280 have_chunks = any(v.chunks is not None for v in dataset.variables.values())
   1282 autoclose = have_chunks and scheduler in ["distributed", "multiprocessing"]

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/xarray/backends/locks.py:86, in _get_scheduler(get, collection)
     83 import dask
     84 from dask.base import get_scheduler  # noqa: F401
---> 86 actual_get = get_scheduler(get, collection)
     87 except ImportError:
     88     return None

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/dask/base.py:1440, in get_scheduler(get, scheduler, collections, cls)
   1437     raise ValueError(get_err_msg)
   1439 try:
-> 1440     from distributed import get_client
   1442     return get_client().get
   1443 except (ImportError, ValueError):

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/distributed/__init__.py:23
     20 from dask.config import config  # type: ignore
     22 from distributed._version import get_versions
---> 23 from distributed.actor import Actor, ActorFuture, BaseActorFuture
     24 from distributed.client import (
     25     Client,
     26     CompatibleExecutor,
    (...)
     35     wait,
     36 )
     37 from distributed.core import Status, connect, rpc

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/distributed/actor.py:13
      9 from typing import Generic, Literal, NoReturn, TypeVar
     11 from tornado.ioloop import IOLoop
---> 13 from distributed.client import Future
     14 from distributed.protocol import to_serialize
     15 from distributed.utils import LateLoopEvent, iscoroutinefunction, sync, thread_state

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/distributed/client.py:117
     94 from distributed.utils import (
     95     CancelledError,
     96     LoopRunner,
    (...)
    107     thread_state,
    108 )
    109 from distributed.utils_comm import (
    110     WrappedKey,
    111     gather_from_workers,
    (...)
    115     unpack_remotedata,
    116 )
--> 117 from distributed.worker import get_client, get_worker, secede
    119 logger = logging.getLogger(__name__)
    121 _global_clients: weakref.WeakValueDictionary[
    122     int, Client
    123 ] = weakref.WeakValueDictionary()

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/distributed/worker.py:120
    118 from distributed.utils_perf import disable_gc_diagnosis, enable_gc_diagnosis
    119 from distributed.versions import get_versions
--> 120 from distributed.worker_memory import (
    121     DeprecatedMemoryManagerAttribute,
    122     DeprecatedMemoryMonitor,
    123     WorkerDataParameter,
    124     WorkerMemoryManager,
    125 )
    126 from distributed.worker_state_machine import (
    127     AcquireReplicasEvent,
    128     BaseWorker,
    (...)
    152     WorkerState,
    153 )
    155 if TYPE_CHECKING:
    156     # FIXME import from typing (needs Python >=3.10)

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/site-packages/distributed/worker_memory.py:56
     53 from distributed.nanny import Nanny
     54 from distributed.worker import Worker
---> 56 WorkerDataParameter: TypeAlias = Union[
     57     # pre-initialized
     58     MutableMapping[Key, object],
     59     # constructor
     60     Callable[[], MutableMapping[Key, object]],
     61     # constructor, passed worker.local_directory
     62     Callable[[str], MutableMapping[Key, object]],
     63     # (constructor, kwargs to constructor)
     64     tuple[Callable[..., MutableMapping[Key, object]], dict[str, Any]],
     65     # initialize internally
     66     None,
     67 ]
     69 worker_logger = logging.getLogger("distributed.worker.memory")
     70 worker_logger.addFilter(RateLimiterFilter(r"Unmanaged memory use is high"))

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/typing.py:243, in _tp_cache.<locals>.inner(*args, **kwds)
    241 except TypeError:
    242     pass  # All real errors (not unhashable args) are raised below.
--> 243 return func(*args, **kwds)

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/typing.py:316, in _SpecialForm.__getitem__(self, parameters)
    314 @_tp_cache
    315 def __getitem__(self, parameters):
--> 316     return self._getitem(self, parameters)

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/typing.py:421, in Union(self, parameters)
    419 msg = "Union[arg, ...]: each arg must be a type."
    420 parameters = tuple(_type_check(p, msg) for p in parameters)
--> 421 parameters = _remove_dups_flatten(parameters)
    422 if len(parameters) == 1:
    423     return parameters[0]

File /storage/local/raid/e1/z3a/personal/allen4jt/environments/conda/p39_nov23/lib/python3.9/typing.py:215, in _remove_dups_flatten(parameters)
    213     params.append(p)
    214 # Weed out strict duplicates, preserving the first of each occurrence.
--> 215 all_params = set(params)
    216 if len(all_params) < len(params):
    217     new_params = []

TypeError: unhashable type: 'list'
```
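The bottom frames point at the standard library rather than xarray: `typing._remove_dups_flatten` calls `set(params)` on the `Union` parameters, and in the earliest CPython 3.9 releases a parameterized `Callable` could surface an unhashable list there (the environment section reports Python 3.9.0, which seems consistent with that). A minimal, hedged sketch of the same `Union` construction from `distributed/worker_memory.py`, with `Key` stood in by `str` for self-containment (an assumption; the real module aliases it elsewhere) and the `tuple[...]` entry omitted for brevity:

```python
# Hypothetical reduction of the WorkerDataParameter alias built in
# distributed/worker_memory.py; "str" replaces the module's Key type.
from typing import Callable, MutableMapping, Union

WorkerDataParameter = Union[
    MutableMapping[str, object],                   # pre-initialized mapping
    Callable[[], MutableMapping[str, object]],     # zero-arg constructor
    Callable[[str], MutableMapping[str, object]],  # constructor taking a directory path
    None,                                          # initialize internally
]

# On the affected interpreter this construction raised
# TypeError: unhashable type: 'list' inside typing._remove_dups_flatten;
# on later 3.9.x patch releases it succeeds.
print(WorkerDataParameter)
```

If this is the right diagnosis, the construction itself, not any xarray code path, is what fails, which would explain why dataset, engine, and `compute` choices make no difference.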

What did you expect to happen?

No response

Minimal Complete Verifiable Example

No response

MVCE confirmation

  • [x] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [x] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [x] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [x] New issue — a search of GitHub Issues suggests this is not a duplicate.
  • [x] Recent environment — the issue occurs with the latest version of xarray and its dependencies.

Relevant log output

```python
import xarray as xr

ds = xr.tutorial.load_dataset("air_temperature")
ds.to_netcdf('testing.nc')
```

Anything else we need to know?

No response

Environment

INSTALLED VERSIONS
------------------
commit: None
python: 3.9.0 (default, Nov 15 2020, 14:28:56) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 5.15.0-86-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.3-development

xarray: 2023.12.0
pandas: 2.1.3
numpy: 1.26.2
scipy: 1.11.4
netCDF4: 1.6.5
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.16.1
cftime: 1.6.3
nc_time_axis: None
iris: None
bottleneck: None
dask: 2023.11.0
distributed: None
matplotlib: 3.8.2
cartopy: 0.22.0
seaborn: 0.13.0
numbagg: None
fsspec: 2023.10.0
cupy: None
pint: 0.22
sparse: 0.14.0
flox: None
numpy_groupies: None
setuptools: 68.0.0
pip: 23.3.1
conda: None
pytest: None
mypy: None
IPython: 8.18.0
sphinx: None
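The environment above reports Python 3.9.0, the first 3.9 release. A small diagnostic sketch, on the assumption that the failure is specific to that patch release, that checks whether the running interpreter is the affected one:

```python
import sys

# The reported failure occurred on CPython 3.9.0; the assumption here is
# that later 3.9.x patch releases do not exhibit the typing TypeError.
print(sys.version_info)
needs_upgrade = sys.version_info[:3] == (3, 9, 0)
print("running the affected 3.9.0 release:", needs_upgrade)
```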
  • state_reason: completed
  • repo: 13221727
  • type: issue
