issue_comments

8 rows where author_association = "NONE" and user = 65610153 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at author_association body reactions performed_via_github_app issue
644253463 https://github.com/pydata/xarray/issues/4157#issuecomment-644253463 https://api.github.com/repos/pydata/xarray/issues/4157 MDEyOklzc3VlQ29tbWVudDY0NDI1MzQ2Mw== EliT1626 65610153 2020-06-15T16:58:51Z 2020-06-15T16:58:51Z NONE

Code Sample

```
import datetime as dt
from itertools import groupby

import pandas as pd
import xarray as xr

xr.set_options(file_cache_maxsize=10)

# Assumes daily increments
def list_dates(start, end):
    num_days = (end - start).days
    return [start + dt.timedelta(days=x) for x in range(num_days)]

def list_dates1(start, end):
    num_days = (end - start).days
    dates = [start + dt.timedelta(days=x) for x in range(num_days)]
    sorted_dates = sorted(dates, key=lambda date: (date.month, date.day))
    grouped_dates = [list(g) for _, g in groupby(sorted_dates, key=lambda date: (date.month, date.day))]
    return grouped_dates

start_date = dt.date(2010, 1, 1)
end_date = dt.date(2019, 12, 31)
date_list = list_dates1(start_date, end_date)
window1 = dt.timedelta(days=5)
window2 = dt.timedelta(days=6)

url = 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/{0:%Y%m}/oisst-avhrr-v02r01.{0:%Y%m%d}.nc'
end_date2 = dt.date(2010, 1, 2)

sst_mean = []
cur_date = start_date

for cur_date in date_list:
    sst_mean_calc = []
    for i in cur_date:
        date_window = list_dates(i - window1, i + window2)
        url_list_window = [url.format(x) for x in date_window]
        window_data = xr.open_mfdataset(url_list_window).sst
        sst_mean_calc.append(window_data.mean('time'))
    sst_mean.append(xr.concat(sst_mean_calc, dim='time').mean('time'))
    cur_date += cur_date
    if cur_date[0] >= end_date2:
        break
    else:
        continue

sst_mean_climo_test = xr.concat(sst_mean, dim='time')

# NOTE: sst_std_calc, sst_min_calc and sst_max_calc are not defined in this excerpt
sst_std = xr.concat(sst_std_calc, dim=pd.DatetimeIndex(date_list, name='time'))

sst_min = xr.concat(sst_min_calc, dim=pd.DatetimeIndex(date_list, name='time'))

sst_max = xr.concat(sst_max_calc, dim=pd.DatetimeIndex(date_list, name='time'))

sst_mean_climo_test.to_netcdf(path='E:/Riskpulse_HD/SST_stuff/sst_mean_climo_test')
```

Explanation of Code

This code (a climatology for SSTs) creates a list of dates between the specified start and end dates that contains the same day number for every month through the year span. For example, date_list[0] contains 10 datetime dates that start with 1-1-2010, 1-1-2011...1-1-2019. I then request OISST data from an OPeNDAP server and take a centered mean of the date in question (in this case I did it for the first and second of January). In other words, I am opening the files for Dec 27-Jan 6 and averaging all of them together. The final xarray dataset then contains two 'times', which is 10 years' worth of data for Jan 1 and Jan 2. I want to then send this to a netCDF file so that I can save it on my local machine and use it to create plots down the road. Hope this makes sense.
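
To make the grouping concrete, here is a minimal self-contained check (an illustrative sketch, independent of the OPeNDAP calls above) of what list_dates1 produces:

```
import datetime as dt
from itertools import groupby

# Rebuild the grouping from list_dates1 above: one group per (month, day),
# each holding that calendar day for every year in the span.
start, end = dt.date(2010, 1, 1), dt.date(2019, 12, 31)
dates = [start + dt.timedelta(days=x) for x in range((end - start).days)]
dates.sort(key=lambda d: (d.month, d.day))
groups = [list(g) for _, g in groupby(dates, key=lambda d: (d.month, d.day))]

print(len(groups))     # 366 (month, day) slots, including Feb 29
print(groups[0][:3])   # [datetime.date(2010, 1, 1), datetime.date(2011, 1, 1), datetime.date(2012, 1, 1)]
print(len(groups[0]))  # 10 -- one Jan 1 per year, 2010-2019
```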

Error Messages

```
KeyError Traceback (most recent call last)
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 197 try: --> 198 file = self._cache[self._key] 199 except KeyError:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __getitem__(self, key) 52 with self._lock: ---> 53 value = self._cache[key] 54 self._cache.move_to_end(key)

KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201801/oisst-avhrr-v02r01.20180106.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

RuntimeError Traceback (most recent call last)
<ipython-input-3-f8395dcffb5e> in <module> 1 #xr.set_options(file_cache_maxsize=500) ----> 2 sst_mean_climo_test.to_netcdf(path='E:/Riskpulse_HD/SST_stuff/sst_mean_climo_test')

~\Anaconda3\lib\site-packages\xarray\core\dataarray.py in to_netcdf(self, *args, **kwargs) 2356 dataset = self.to_dataset() 2357 -> 2358 return dataset.to_netcdf(*args, **kwargs) 2359 2360 def to_dict(self, data: bool = True) -> dict:

~\Anaconda3\lib\site-packages\xarray\core\dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf) 1552 unlimited_dims=unlimited_dims, 1553 compute=compute, -> 1554 invalid_netcdf=invalid_netcdf, 1555 ) 1556

~\Anaconda3\lib\site-packages\xarray\backends\api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf) 1095 return writer, store 1096 -> 1097 writes = writer.sync(compute=compute) 1098 1099 if path_or_file is None:

~\Anaconda3\lib\site-packages\xarray\backends\common.py in sync(self, compute) 202 compute=compute, 203 flush=True, --> 204 regions=self.regions, 205 ) 206 self.sources = []

~\Anaconda3\lib\site-packages\dask\array\core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs) 943 944 if compute: --> 945 result.compute(**kwargs) 946 return None 947 else:

~\Anaconda3\lib\site-packages\dask\base.py in compute(self, **kwargs) 164 dask.base.compute 165 """ --> 166 (result,) = compute(self, traverse=False, **kwargs) 167 return result 168

~\Anaconda3\lib\site-packages\dask\base.py in compute(*args, **kwargs) 442 postcomputes.append(x.__dask_postcompute__()) 443 --> 444 results = schedule(dsk, keys, **kwargs) 445 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)]) 446

~\Anaconda3\lib\site-packages\dask\threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs) 82 get_id=_thread_get_id, 83 pack_exception=pack_exception, ---> 84 **kwargs 85 ) 86

~\Anaconda3\lib\site-packages\dask\local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs) 484 _execute_task(task, data) # Re-execute locally 485 else: --> 486 raise_exception(exc, tb) 487 res, worker_id = loads(res_info) 488 state["cache"][key] = res

~\Anaconda3\lib\site-packages\dask\local.py in reraise(exc, tb) 314 if exc.__traceback__ is not tb: 315 raise exc.with_traceback(tb) --> 316 raise exc 317 318

~\Anaconda3\lib\site-packages\dask\local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception) 220 try: 221 task, data = loads(task_info) --> 222 result = _execute_task(task, data) 223 id = get_id() 224 result = dumps((result, id))

~\Anaconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk) 119 # temporaries by their reference count and can execute certain 120 # operations in-place. --> 121 return func(*(_execute_task(a, cache) for a in args)) 122 elif not ishashable(arg): 123 return arg

~\Anaconda3\lib\site-packages\dask\array\core.py in getter(a, b, asarray, lock) 98 c = a[b] 99 if asarray: --> 100 c = np.asarray(c) 101 finally: 102 if lock:

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 489 490 def __array__(self, dtype=None): --> 491 return np.asarray(self.array, dtype=dtype) 492 493 def __getitem__(self, key):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 651 652 def __array__(self, dtype=None): --> 653 return np.asarray(self.array, dtype=dtype) 654 655 def __getitem__(self, key):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 555 def __array__(self, dtype=None): 556 array = as_indexable(self.array) --> 557 return np.asarray(array[self.key], dtype=None) 558 559 def transpose(self, order):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype) 70 71 def __array__(self, dtype=None): ---> 72 return self.func(self.array) 73 74 def __repr__(self):

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in _scale_offset_decoding(data, scale_factor, add_offset, dtype) 216 217 def _scale_offset_decoding(data, scale_factor, add_offset, dtype): --> 218 data = np.array(data, dtype=dtype, copy=True) 219 if scale_factor is not None: 220 data *= scale_factor

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype) 70 71 def __array__(self, dtype=None): ---> 72 return self.func(self.array) 73 74 def __repr__(self):

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in _apply_mask(data, encoded_fill_values, decoded_fill_value, dtype) 136 ) -> np.ndarray: 137 """Mask all matching values in a NumPy arrays.""" --> 138 data = np.asarray(data, dtype=dtype) 139 condition = False 140 for fv in encoded_fill_values:

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 555 def __array__(self, dtype=None): 556 array = as_indexable(self.array) --> 557 return np.asarray(array[self.key], dtype=None) 558 559 def transpose(self, order):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in __getitem__(self, key) 71 def __getitem__(self, key): 72 return indexing.explicit_indexing_adapter( ---> 73 key, self.shape, indexing.IndexingSupport.OUTER, self._getitem 74 ) 75

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method) 835 """ 836 raw_key, numpy_indices = decompose_indexer(key, shape, indexing_support) --> 837 result = raw_indexing_method(raw_key.tuple) 838 if numpy_indices.tuple: 839 # index the loaded np.ndarray

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _getitem(self, key) 82 try: 83 with self.datastore.lock: ---> 84 original_array = self.get_array(needs_lock=False) 85 array = getitem(original_array, key) 86 except IndexError:

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in get_array(self, needs_lock) 61 62 def get_array(self, needs_lock=True): ---> 63 ds = self.datastore._acquire(needs_lock) 64 variable = ds.variables[self.variable_name] 65 variable.set_auto_maskandscale(False)

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _acquire(self, needs_lock) 359 360 def _acquire(self, needs_lock=True): --> 361 with self._manager.acquire_context(needs_lock) as root: 362 ds = _nc4_require_group(root, self._group, self._mode) 363 return ds

~\Anaconda3\lib\contextlib.py in __enter__(self) 110 del self.args, self.kwds, self.func 111 try: --> 112 return next(self.gen) 113 except StopIteration: 114 raise RuntimeError("generator didn't yield") from None

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in acquire_context(self, needs_lock) 184 def acquire_context(self, needs_lock=True): 185 """Context manager for acquiring a file.""" --> 186 file, cached = self._acquire_with_cache_info(needs_lock) 187 try: 188 yield file

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 206 # ensure file doesn't get overriden when opened again 207 self._mode = "a" --> 208 self._cache[self._key] = file 209 return file, False 210 else:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __setitem__(self, key, value) 71 elif self._maxsize: 72 # make room if necessary ---> 73 self._enforce_size_limit(self._maxsize - 1) 74 self._cache[key] = value 75 elif self._on_evict is not None:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in _enforce_size_limit(self, capacity) 61 key, value = self._cache.popitem(last=False) 62 if self._on_evict is not None: ---> 63 self._on_evict(key, value) 64 65 def __setitem__(self, key: K, value: V) -> None:

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in <lambda>(k, v) 12 # Global cache for storing open files. 13 FILE_CACHE: LRUCache[str, io.IOBase] = LRUCache( ---> 14 maxsize=cast(int, OPTIONS["file_cache_maxsize"]), on_evict=lambda k, v: v.close() 15 ) 16 assert FILE_CACHE.maxsize, "file cache must be at least size one"

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.close()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset._close()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

RuntimeError: NetCDF: HDF error
```

**I also tried setting `xr.set_options(file_cache_maxsize=500)` outside of the loop before trying to create the netcdf file and received this error:**

```
KeyError Traceback (most recent call last)
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 197 try: --> 198 file = self._cache[self._key] 199 except KeyError:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __getitem__(self, key) 52 with self._lock: ---> 53 value = self._cache[key] 54 self._cache.move_to_end(key)

KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201512/oisst-avhrr-v02r01.20151231.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

OSError Traceback (most recent call last)
<ipython-input-4-474cdce51e60> in <module> 1 xr.set_options(file_cache_maxsize=500) ----> 2 sst_mean_climo_test.to_netcdf(path='E:/Riskpulse_HD/SST_stuff/sst_mean_climo_test')

~\Anaconda3\lib\site-packages\xarray\core\dataarray.py in to_netcdf(self, *args, **kwargs) 2356 dataset = self.to_dataset() 2357 -> 2358 return dataset.to_netcdf(*args, **kwargs) 2359 2360 def to_dict(self, data: bool = True) -> dict:

~\Anaconda3\lib\site-packages\xarray\core\dataset.py in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute, invalid_netcdf) 1552 unlimited_dims=unlimited_dims, 1553 compute=compute, -> 1554 invalid_netcdf=invalid_netcdf, 1555 ) 1556

~\Anaconda3\lib\site-packages\xarray\backends\api.py in to_netcdf(dataset, path_or_file, mode, format, group, engine, encoding, unlimited_dims, compute, multifile, invalid_netcdf) 1095 return writer, store 1096 -> 1097 writes = writer.sync(compute=compute) 1098 1099 if path_or_file is None:

~\Anaconda3\lib\site-packages\xarray\backends\common.py in sync(self, compute) 202 compute=compute, 203 flush=True, --> 204 regions=self.regions, 205 ) 206 self.sources = []

~\Anaconda3\lib\site-packages\dask\array\core.py in store(sources, targets, lock, regions, compute, return_stored, **kwargs) 943 944 if compute: --> 945 result.compute(**kwargs) 946 return None 947 else:

~\Anaconda3\lib\site-packages\dask\base.py in compute(self, **kwargs) 164 dask.base.compute 165 """ --> 166 (result,) = compute(self, traverse=False, **kwargs) 167 return result 168

~\Anaconda3\lib\site-packages\dask\base.py in compute(*args, **kwargs) 442 postcomputes.append(x.__dask_postcompute__()) 443 --> 444 results = schedule(dsk, keys, **kwargs) 445 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)]) 446

~\Anaconda3\lib\site-packages\dask\threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs) 82 get_id=_thread_get_id, 83 pack_exception=pack_exception, ---> 84 **kwargs 85 ) 86

~\Anaconda3\lib\site-packages\dask\local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs) 484 _execute_task(task, data) # Re-execute locally 485 else: --> 486 raise_exception(exc, tb) 487 res, worker_id = loads(res_info) 488 state["cache"][key] = res

~\Anaconda3\lib\site-packages\dask\local.py in reraise(exc, tb) 314 if exc.__traceback__ is not tb: 315 raise exc.with_traceback(tb) --> 316 raise exc 317 318

~\Anaconda3\lib\site-packages\dask\local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception) 220 try: 221 task, data = loads(task_info) --> 222 result = _execute_task(task, data) 223 id = get_id() 224 result = dumps((result, id))

~\Anaconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk) 119 # temporaries by their reference count and can execute certain 120 # operations in-place. --> 121 return func(*(_execute_task(a, cache) for a in args)) 122 elif not ishashable(arg): 123 return arg

~\Anaconda3\lib\site-packages\dask\array\core.py in getter(a, b, asarray, lock) 98 c = a[b] 99 if asarray: --> 100 c = np.asarray(c) 101 finally: 102 if lock:

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 489 490 def __array__(self, dtype=None): --> 491 return np.asarray(self.array, dtype=dtype) 492 493 def __getitem__(self, key):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 651 652 def __array__(self, dtype=None): --> 653 return np.asarray(self.array, dtype=dtype) 654 655 def __getitem__(self, key):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 555 def __array__(self, dtype=None): 556 array = as_indexable(self.array) --> 557 return np.asarray(array[self.key], dtype=None) 558 559 def transpose(self, order):

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype) 70 71 def __array__(self, dtype=None): ---> 72 return self.func(self.array) 73 74 def __repr__(self):

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in _scale_offset_decoding(data, scale_factor, add_offset, dtype) 216 217 def _scale_offset_decoding(data, scale_factor, add_offset, dtype): --> 218 data = np.array(data, dtype=dtype, copy=True) 219 if scale_factor is not None: 220 data *= scale_factor

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype) 70 71 def __array__(self, dtype=None): ---> 72 return self.func(self.array) 73 74 def __repr__(self):

~\Anaconda3\lib\site-packages\xarray\coding\variables.py in _apply_mask(data, encoded_fill_values, decoded_fill_value, dtype) 136 ) -> np.ndarray: 137 """Mask all matching values in a NumPy arrays.""" --> 138 data = np.asarray(data, dtype=dtype) 139 condition = False 140 for fv in encoded_fill_values:

~\Anaconda3\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order) 83 84 """ ---> 85 return array(a, dtype, copy=False, order=order) 86 87

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype) 555 def __array__(self, dtype=None): 556 array = as_indexable(self.array) --> 557 return np.asarray(array[self.key], dtype=None) 558 559 def transpose(self, order):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in __getitem__(self, key) 71 def __getitem__(self, key): 72 return indexing.explicit_indexing_adapter( ---> 73 key, self.shape, indexing.IndexingSupport.OUTER, self._getitem 74 ) 75

~\Anaconda3\lib\site-packages\xarray\core\indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method) 835 """ 836 raw_key, numpy_indices = decompose_indexer(key, shape, indexing_support) --> 837 result = raw_indexing_method(raw_key.tuple) 838 if numpy_indices.tuple: 839 # index the loaded np.ndarray

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _getitem(self, key) 82 try: 83 with self.datastore.lock: ---> 84 original_array = self.get_array(needs_lock=False) 85 array = getitem(original_array, key) 86 except IndexError:

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in get_array(self, needs_lock) 61 62 def get_array(self, needs_lock=True): ---> 63 ds = self.datastore._acquire(needs_lock) 64 variable = ds.variables[self.variable_name] 65 variable.set_auto_maskandscale(False)

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _acquire(self, needs_lock) 359 360 def _acquire(self, needs_lock=True): --> 361 with self._manager.acquire_context(needs_lock) as root: 362 ds = _nc4_require_group(root, self._group, self._mode) 363 return ds

~\Anaconda3\lib\contextlib.py in __enter__(self) 110 del self.args, self.kwds, self.func 111 try: --> 112 return next(self.gen) 113 except StopIteration: 114 raise RuntimeError("generator didn't yield") from None

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in acquire_context(self, needs_lock) 184 def acquire_context(self, needs_lock=True): 185 """Context manager for acquiring a file.""" --> 186 file, cached = self._acquire_with_cache_info(needs_lock) 187 try: 188 yield file

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 202 kwargs = kwargs.copy() 203 kwargs["mode"] = self._mode --> 204 file = self._opener(*self._args, **kwargs) 205 if self._mode == "w": 206 # ensure file doesn't get overriden when opened again

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

OSError: [Errno -37] NetCDF: Write to read only: b'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201512/oisst-avhrr-v02r01.20151231.nc'
```

I believe these errors have something to do with a post that I created a couple of weeks ago (https://github.com/pydata/xarray/issues/4082).

I'm not sure if you can @ users on here, but @rsignell-usgs found out something about the caching beforehand. It seems that this is some sort of Windows issue.
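
If the open remote handles are the problem, one possible mitigation (an untested sketch on my side, reusing list_dates, url, window1 and window2 from the code sample above) is to eagerly load each window mean into memory, so the concatenated result no longer references OPeNDAP files that the cache may have already closed by the time to_netcdf() runs:

```
# Sketch: load each window mean eagerly so to_netcdf() writes from in-memory
# arrays instead of reopening (possibly evicted) remote OPeNDAP datasets.
sst_mean_calc = []
for i in cur_date:
    date_window = list_dates(i - window1, i + window2)
    url_list_window = [url.format(x) for x in date_window]
    with xr.open_mfdataset(url_list_window) as window_data:
        sst_mean_calc.append(window_data.sst.mean('time').load())
```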

Versions: python 3.7.4, xarray 0.15.1, pandas 1.0.3, numpy 1.18.1, scipy 1.4.1, netcdf4 1.4.2

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.dataset_to_netcdf() 638991453
639145573 https://github.com/pydata/xarray/issues/4082#issuecomment-639145573 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzOTE0NTU3Mw== EliT1626 65610153 2020-06-04T22:11:55Z 2020-06-04T22:11:55Z NONE

Thanks for putting the time in to figure this out! I appreciate it. Tried it myself and it worked, just as you mentioned.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632819541 https://github.com/pydata/xarray/issues/4082#issuecomment-632819541 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjgxOTU0MQ== EliT1626 65610153 2020-05-22T17:29:02Z 2020-05-22T17:29:02Z NONE

After discussing this issue with someone who has a lot more knowledge than me, it seems pertinent to mention that I am using a Windows machine. He is able to run the script fine in his Linux environment, much like some of you have been able to do. I have tried changing the window to different sizes, and the script always fails around 25 or so calls to the OPeNDAP server. This was done in a new environment with only the required packages installed, updated to the latest versions. Is there some sort of issue with Windows in this regard?
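
For reference, xarray caps the number of simultaneously open files (including OPeNDAP handles) with an LRU cache whose default size is 128, and each 11-day window above opens about 11 remote handles, so the cache fills after a couple of dozen calls. Raising the cap is one knob to try (sketch):

```
import xarray as xr

# Each open_mfdataset call above opens ~11 remote handles; once the LRU
# file cache (default size 128) fills up, older handles are evicted and closed.
xr.set_options(file_cache_maxsize=512)
```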

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632378632 https://github.com/pydata/xarray/issues/4082#issuecomment-632378632 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjM3ODYzMg== EliT1626 65610153 2020-05-21T22:28:56Z 2020-05-21T22:28:56Z NONE

It turns out I did mix the two. I uninstalled netcdf4 from pip and reinstalled via conda, but now I am back to the original error from my first post.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632264658 https://github.com/pydata/xarray/issues/4082#issuecomment-632264658 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjI2NDY1OA== EliT1626 65610153 2020-05-21T18:19:18Z 2020-05-21T18:19:18Z NONE

Okay, I updated all packages in my current environment. I am getting a new error now.

```
AttributeError Traceback (most recent call last)
<ipython-input-2-598b402f7000> in <module> 18 date_window = list_dates(cur_date - window, cur_date + window) 19 url_list = [url.format(x) for x in date_window] ---> 20 window_data=xr.open_mfdataset(url_list).sst 21 data.append(window_data.mean('time')) 22 print(data[-1])

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs) 906 getattr_ = getattr 907 --> 908 datasets = [open_(p, **open_kwargs) for p in paths] 909 file_objs = [getattr_(ds, "_file_obj") for ds in datasets] 910 if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in <listcomp>(.0) 906 getattr_ = getattr 907 --> 908 datasets = [open_(p, **open_kwargs) for p in paths] 909 file_objs = [getattr_(ds, "_file_obj") for ds in datasets] 910 if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime) 497 498 if engine is None: --> 499 engine = _get_default_engine(filename_or_obj, allow_remote=True) 500 if engine == "netcdf4": 501 store = backends.NetCDF4DataStore.open(

~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine(path, allow_remote) 145 def _get_default_engine(path, allow_remote=False): 146 if allow_remote and is_remote_uri(path): --> 147 engine = _get_default_engine_remote_uri() 148 elif is_grib_path(path): 149 engine = _get_default_engine_grib()

~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine_remote_uri() 46 def _get_default_engine_remote_uri(): 47 try: ---> 48 import netCDF4 # noqa: F401 49 50 engine = "netcdf4"

~\Anaconda3\lib\site-packages\netCDF4\__init__.py in <module> 1 # init for netCDF4. package 2 # Docstring comes from extension module _netCDF4. ----> 3 from ._netCDF4 import * 4 # Need explicit imports for names beginning with underscores 5 from ._netCDF4 import __doc__, __pdoc__

include\membuf.pyx in init netCDF4._netCDF4()

AttributeError: type object 'netCDF4._netCDF4._MemBuf' has no attribute '__reduce_cython__'
```
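
For what it's worth, this kind of missing-Cython-attribute error is a classic symptom of mixed pip and conda copies of a compiled package. A quick sanity check (an illustrative sketch) is to see which netCDF4 build actually gets imported:

```
import netCDF4

# If the path points at a pip install inside a conda environment (or the
# version is not the one conda reports), the builds are likely mixed.
print(netCDF4.__version__)
print(netCDF4.__file__)
```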

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632258946 https://github.com/pydata/xarray/issues/4082#issuecomment-632258946 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjI1ODk0Ng== EliT1626 65610153 2020-05-21T18:07:05Z 2020-05-21T18:07:05Z NONE

Updating what specifically? Certain packages?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632238093 https://github.com/pydata/xarray/issues/4082#issuecomment-632238093 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjIzODA5Mw== EliT1626 65610153 2020-05-21T17:25:20Z 2020-05-21T17:25:32Z NONE

Update. I tried setting up a virtual environment and running this script with older versions of both netCDF4 and xarray. The end result was that the script stopped working after only a few timesteps instead of the larger number it managed before. Here is the full error traceback. I still have not been able to find any info on what this might mean. Would this also be worth posting on the netCDF4 GitHub page?

Error:

```
IOError Traceback (most recent call last)
<ipython-input-2-598b402f7000> in <module>() 18 date_window = list_dates(cur_date - window, cur_date + window) 19 url_list = [url.format(x) for x in date_window] ---> 20 window_data=xr.open_mfdataset(url_list).sst 21 data.append(window_data.mean('time')) 22 print(data[-1])

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, autoclose, parallel, **kwargs) 622 getattr_ = getattr 623 --> 624 datasets = [open_(p, **open_kwargs) for p in paths] 625 file_objs = [getattr_(ds, '_file_obj') for ds in datasets] 626 if preprocess is not None:

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs) 318 group=group, 319 autoclose=autoclose, --> 320 **backend_kwargs) 321 elif engine == 'scipy': 322 store = backends.ScipyDataStore(filename_or_obj,

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in open(cls, filename, mode, format, group, writer, clobber, diskless, persist, autoclose, lock) 329 diskless=diskless, persist=persist, 330 format=format) --> 331 ds = opener() 332 return cls(ds, mode=mode, writer=writer, opener=opener, 333 autoclose=autoclose, lock=lock)

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in _open_netcdf4_group(filename, mode, group, **kwargs) 228 import netCDF4 as nc4 229 --> 230 ds = nc4.Dataset(filename, mode=mode, **kwargs) 231 232 with close_on_error(ds):

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

IOError: [Errno -37] NetCDF: Write to read only: 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201703/oisst-avhrr-v02r01.20170304.nc'
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
631125403 https://github.com/pydata/xarray/issues/4082#issuecomment-631125403 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMTEyNTQwMw== EliT1626 65610153 2020-05-19T22:42:14Z 2020-05-19T22:42:14Z NONE

Yes, it is quite long though.

Error:

```
KeyError Traceback (most recent call last)
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 197 try: --> 198 file = self._cache[self._key] 199 except KeyError:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __getitem__(self, key) 52 with self._lock: ---> 53 value = self._cache[key] 54 self._cache.move_to_end(key)

KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

OSError Traceback (most recent call last)
<ipython-input-2-2402d81dac52> in <module> 17 date_window = list_dates(cur_date - window, cur_date + window) 18 url_list = [url.format(x) for x in date_window] ---> 19 window_data=xr.open_mfdataset(url_list).sst 20 data.append(window_data.mean('time')) 21 print(data[-1])

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs) 906 getattr_ = getattr 907 --> 908 datasets = [open_(p, **open_kwargs) for p in paths] 909 file_objs = [getattr_(ds, "_file_obj") for ds in datasets] 910 if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in <listcomp>(.0) 906 getattr_ = getattr 907 --> 908 datasets = [open_(p, **open_kwargs) for p in paths] 909 file_objs = [getattr_(ds, "_file_obj") for ds in datasets] 910 if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime) 500 if engine == "netcdf4": 501 store = backends.NetCDF4DataStore.open( --> 502 filename_or_obj, group=group, lock=lock, **backend_kwargs 503 ) 504 elif engine == "scipy":

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in open(cls, filename, mode, format, group, clobber, diskless, persist, lock, lock_maker, autoclose) 356 netCDF4.Dataset, filename, mode=mode, kwargs=kwargs 357 ) --> 358 return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose) 359 360 def _acquire(self, needs_lock=True):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in __init__(self, manager, group, mode, lock, autoclose) 312 self._group = group 313 self._mode = mode --> 314 self.format = self.ds.data_model 315 self._filename = self.ds.filepath() 316 self.is_remote = is_remote_uri(self._filename)

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in ds(self) 365 @property 366 def ds(self): --> 367 return self._acquire() 368 369 def open_store_variable(self, name, var):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _acquire(self, needs_lock) 359 360 def _acquire(self, needs_lock=True): --> 361 with self._manager.acquire_context(needs_lock) as root: 362 ds = _nc4_require_group(root, self._group, self._mode) 363 return ds

~\Anaconda3\lib\contextlib.py in __enter__(self) 110 del self.args, self.kwds, self.func 111 try: --> 112 return next(self.gen) 113 except StopIteration: 114 raise RuntimeError("generator didn't yield") from None

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in acquire_context(self, needs_lock) 184 def acquire_context(self, needs_lock=True): 185 """Context manager for acquiring a file.""" --> 186 file, cached = self._acquire_with_cache_info(needs_lock) 187 try: 188 yield file

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock) 202 kwargs = kwargs.copy() 203 kwargs["mode"] = self._mode --> 204 file = self._opener(*self._args, **kwargs) 205 if self._mode == "w": 206 # ensure file doesn't get overriden when opened again

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

OSError: [Errno -37] NetCDF: Write to read only: b'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc'
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);