issue_comments


7 rows where author_association = "NONE", issue = 621177286 and user = 65610153 sorted by updated_at descending




id html_url issue_url node_id user created_at updated_at author_association body reactions performed_via_github_app issue
639145573 https://github.com/pydata/xarray/issues/4082#issuecomment-639145573 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzOTE0NTU3Mw== EliT1626 65610153 2020-06-04T22:11:55Z 2020-06-04T22:11:55Z NONE

Thanks for putting the time in to figure this out! I appreciate it. Tried it myself and it worked, just as you mentioned.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  "write to read-only" Error in xarray.open_mfdataset() with opendap datasets 621177286
632819541 https://github.com/pydata/xarray/issues/4082#issuecomment-632819541 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjgxOTU0MQ== EliT1626 65610153 2020-05-22T17:29:02Z 2020-05-22T17:29:02Z NONE

After discussing this issue with someone far more knowledgeable than me, it seems pertinent to mention that I am using a Windows machine. He is able to run the script fine in his Linux environment, much like some of you have been able to do. I have tried changing the window to different sizes, and the script always fails after roughly 25 calls to the OPeNDAP server. This was done in a fresh environment with only the required packages installed, updated to their latest versions. Is there some sort of issue with Windows in this regard?
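For anyone trying to reproduce this, the failing script (per the tracebacks later in the thread) builds a list of OPeNDAP URLs for a date window and hands them to `xr.open_mfdataset`. Here is a stdlib-only sketch of that setup; `list_dates` and the URL template are reconstructed assumptions inferred from the traceback and error message, since the original definitions are not shown in this thread:

```python
from datetime import date, timedelta

def list_dates(start, end):
    """All dates from start to end inclusive, formatted YYYYMMDD."""
    n_days = (end - start).days
    return [(start + timedelta(days=d)).strftime("%Y%m%d") for d in range(n_days + 1)]

# Template guessed from the OSError message below. {0} is a YYYYMMDD string;
# the {0:.6} spec truncates it to YYYYMM for the month directory.
url = ("https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/"
       "V2.1/AVHRR/{0:.6}/oisst-avhrr-v02r01.{0}.nc")

cur_date, window = date(2017, 3, 4), timedelta(days=2)
date_window = list_dates(cur_date - window, cur_date + window)
url_list = [url.format(x) for x in date_window]
# window_data = xr.open_mfdataset(url_list).sst  # requires xarray + netCDF4
```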

632378632 https://github.com/pydata/xarray/issues/4082#issuecomment-632378632 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjM3ODYzMg== EliT1626 65610153 2020-05-21T22:28:56Z 2020-05-21T22:28:56Z NONE

It turns out I did mix the two. I uninstalled netCDF4 from pip and reinstalled it via conda, but now I am back to the original error from my first post.
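When pip and conda installs of the same package get mixed, it can help to check which copy Python actually imports. A small stdlib-only probe (a hypothetical helper, not part of xarray or netCDF4):

```python
import importlib.util

def locate(package):
    """Filesystem path a top-level package would be imported from, or None if absent."""
    spec = importlib.util.find_spec(package)
    return spec.origin if spec else None

# A path under ...\Anaconda3\Lib\site-packages tells you which environment's
# copy wins; a stray pip install elsewhere on sys.path can shadow it.
print(locate("netCDF4"))
```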

632264658 https://github.com/pydata/xarray/issues/4082#issuecomment-632264658 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjI2NDY1OA== EliT1626 65610153 2020-05-21T18:19:18Z 2020-05-21T18:19:18Z NONE

Okay, I updated all packages in my current environment. I am getting a new error now.

```
AttributeError                            Traceback (most recent call last)
<ipython-input-2-598b402f7000> in <module>
     18 date_window = list_dates(cur_date - window, cur_date + window)
     19 url_list = [url.format(x) for x in date_window]
---> 20 window_data=xr.open_mfdataset(url_list).sst
     21 data.append(window_data.mean('time'))
     22 print(data[-1])

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs)
    906     getattr_ = getattr
    907 
--> 908     datasets = [open_(p, **open_kwargs) for p in paths]
    909     file_objs = [getattr_(ds, "_file_obj") for ds in datasets]
    910     if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in <listcomp>(.0)
    906     getattr_ = getattr
    907 
--> 908     datasets = [open_(p, **open_kwargs) for p in paths]
    909     file_objs = [getattr_(ds, "_file_obj") for ds in datasets]
    910     if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime)
    497 
    498     if engine is None:
--> 499         engine = _get_default_engine(filename_or_obj, allow_remote=True)
    500     if engine == "netcdf4":
    501         store = backends.NetCDF4DataStore.open(

~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine(path, allow_remote)
    145 def _get_default_engine(path, allow_remote=False):
    146     if allow_remote and is_remote_uri(path):
--> 147         engine = _get_default_engine_remote_uri()
    148     elif is_grib_path(path):
    149         engine = _get_default_engine_grib()

~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine_remote_uri()
     46 def _get_default_engine_remote_uri():
     47     try:
---> 48         import netCDF4  # noqa: F401
     49 
     50         engine = "netcdf4"

~\Anaconda3\lib\site-packages\netCDF4\__init__.py in <module>
      1 # init for netCDF4. package
      2 # Docstring comes from extension module _netCDF4.
----> 3 from ._netCDF4 import *
      4 # Need explicit imports for names beginning with underscores
      5 from ._netCDF4 import __doc__, __pdoc__

include\membuf.pyx in init netCDF4._netCDF4()

AttributeError: type object 'netCDF4._netCDF4._MemBuf' has no attribute '__reduce_cython__'
```
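An `AttributeError` like `'__reduce_cython__'` raised during `import netCDF4` usually points to a stale or mismatched compiled extension rather than missing code. A hedged diagnostic sketch (an illustrative helper, not an xarray or netCDF4 API) that separates "not installed" from "installed but broken at import time":

```python
import importlib

def check_import(name):
    """Classify an import attempt: 'ok', 'missing', or 'broken build'."""
    try:
        importlib.import_module(name)
        return "ok"
    except ImportError:
        return "missing"
    except AttributeError:
        # e.g. a Cython extension compiled against a different library version
        return "broken build"

print(check_import("netCDF4"))
```

A "broken build" result would suggest removing every copy of the package (pip and conda) and reinstalling from a single channel.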

632258946 https://github.com/pydata/xarray/issues/4082#issuecomment-632258946 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjI1ODk0Ng== EliT1626 65610153 2020-05-21T18:07:05Z 2020-05-21T18:07:05Z NONE

Updating what specifically? Certain packages?

632238093 https://github.com/pydata/xarray/issues/4082#issuecomment-632238093 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMjIzODA5Mw== EliT1626 65610153 2020-05-21T17:25:20Z 2020-05-21T17:25:32Z NONE

Update. I tried creating a virtual environment and running this script against older versions of both netCDF4 and xarray. The end result is that the script now stops working after only a few timesteps, instead of the larger number it managed before. Here is the full error traceback. I still have not been able to find any information on what it might mean. Would this also be worth posting on the netCDF4 GitHub page?

Error:

```
IOError                                   Traceback (most recent call last)
<ipython-input-2-598b402f7000> in <module>()
     18 date_window = list_dates(cur_date - window, cur_date + window)
     19 url_list = [url.format(x) for x in date_window]
---> 20 window_data=xr.open_mfdataset(url_list).sst
     21 data.append(window_data.mean('time'))
     22 print(data[-1])

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, autoclose, parallel, **kwargs)
    622     getattr_ = getattr
    623 
--> 624     datasets = [open_(p, **open_kwargs) for p in paths]
    625     file_objs = [getattr_(ds, '_file_obj') for ds in datasets]
    626     if preprocess is not None:

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    318             group=group,
    319             autoclose=autoclose,
--> 320             **backend_kwargs)
    321     elif engine == 'scipy':
    322         store = backends.ScipyDataStore(filename_or_obj,

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in open(cls, filename, mode, format, group, writer, clobber, diskless, persist, autoclose, lock)
    329                                    diskless=diskless, persist=persist,
    330                                    format=format)
--> 331         ds = opener()
    332         return cls(ds, mode=mode, writer=writer, opener=opener,
    333                    autoclose=autoclose, lock=lock)

C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in _open_netcdf4_group(filename, mode, group, **kwargs)
    228     import netCDF4 as nc4
    229 
--> 230     ds = nc4.Dataset(filename, mode=mode, **kwargs)
    231 
    232     with close_on_error(ds):

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

IOError: [Errno -37] NetCDF: Write to read only: 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201703/oisst-avhrr-v02r01.20170304.nc'
```
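Since the opens fail only after a run of consecutive requests to the THREDDS server, one low-risk workaround to try is retrying each open with backoff. This is a sketch under that assumption, using a hypothetical stdlib helper; xarray does not provide anything like it out of the box:

```python
import time

def with_retries(fn, attempts=3, delay=1.0):
    """Call fn(), retrying on OSError with exponential backoff between attempts."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:
            if attempt == attempts - 1:
                raise  # out of retries; re-raise the last error
            time.sleep(delay * (2 ** attempt))

# Usage sketch (requires xarray and a url_list as in the failing script):
# datasets = [with_retries(lambda u=u: xr.open_dataset(u)) for u in url_list]
```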

631125403 https://github.com/pydata/xarray/issues/4082#issuecomment-631125403 https://api.github.com/repos/pydata/xarray/issues/4082 MDEyOklzc3VlQ29tbWVudDYzMTEyNTQwMw== EliT1626 65610153 2020-05-19T22:42:14Z 2020-05-19T22:42:14Z NONE

Yes, it is quite long though.

Error:

```
KeyError                                  Traceback (most recent call last)
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock)
    197         try:
--> 198             file = self._cache[self._key]
    199         except KeyError:

~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __getitem__(self, key)
     52         with self._lock:
---> 53             value = self._cache[key]
     54             self._cache.move_to_end(key)

KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
<ipython-input-2-2402d81dac52> in <module>
     17 date_window = list_dates(cur_date - window, cur_date + window)
     18 url_list = [url.format(x) for x in date_window]
---> 19 window_data=xr.open_mfdataset(url_list).sst
     20 data.append(window_data.mean('time'))
     21 print(data[-1])

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs)
    906     getattr_ = getattr
    907 
--> 908     datasets = [open_(p, **open_kwargs) for p in paths]
    909     file_objs = [getattr_(ds, "_file_obj") for ds in datasets]
    910     if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in <listcomp>(.0)
    906     getattr_ = getattr
    907 
--> 908     datasets = [open_(p, **open_kwargs) for p in paths]
    909     file_objs = [getattr_(ds, "_file_obj") for ds in datasets]
    910     if preprocess is not None:

~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime)
    500     if engine == "netcdf4":
    501         store = backends.NetCDF4DataStore.open(
--> 502             filename_or_obj, group=group, lock=lock, **backend_kwargs
    503         )
    504     elif engine == "scipy":

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in open(cls, filename, mode, format, group, clobber, diskless, persist, lock, lock_maker, autoclose)
    356             netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
    357         )
--> 358         return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
    359 
    360     def _acquire(self, needs_lock=True):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in __init__(self, manager, group, mode, lock, autoclose)
    312         self._group = group
    313         self._mode = mode
--> 314         self.format = self.ds.data_model
    315         self._filename = self.ds.filepath()
    316         self.is_remote = is_remote_uri(self._filename)

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in ds(self)
    365     @property
    366     def ds(self):
--> 367         return self._acquire()
    368 
    369     def open_store_variable(self, name, var):

~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _acquire(self, needs_lock)
    359 
    360     def _acquire(self, needs_lock=True):
--> 361         with self._manager.acquire_context(needs_lock) as root:
    362             ds = _nc4_require_group(root, self._group, self._mode)
    363         return ds

~\Anaconda3\lib\contextlib.py in __enter__(self)
    110         del self.args, self.kwds, self.func
    111         try:
--> 112             return next(self.gen)
    113         except StopIteration:
    114             raise RuntimeError("generator didn't yield") from None

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in acquire_context(self, needs_lock)
    184     def acquire_context(self, needs_lock=True):
    185         """Context manager for acquiring a file."""
--> 186         file, cached = self._acquire_with_cache_info(needs_lock)
    187         try:
    188             yield file

~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock)
    202                 kwargs = kwargs.copy()
    203                 kwargs["mode"] = self._mode
--> 204                 file = self._opener(*self._args, **kwargs)
    205                 if self._mode == "w":
    206                     # ensure file doesn't get overriden when opened again

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

OSError: [Errno -37] NetCDF: Write to read only: b'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc'
```
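Worth noting for other readers: the KeyError in the first half of this traceback is just a cache miss in xarray's file-manager LRU cache (hence the `move_to_end` call); the OSError raised while opening the URL afterwards is the actual failure. As a rough illustration of what that cache is doing, here is a minimal stdlib-only LRU cache; the class and names are illustrative, not xarray's actual implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Tiny least-recently-used cache, similar in spirit to the
    lru_cache seen in the traceback above."""

    def __init__(self, maxsize):
        self._cache = OrderedDict()
        self._maxsize = maxsize

    def __getitem__(self, key):
        value = self._cache[key]      # raises KeyError on a miss
        self._cache.move_to_end(key)  # mark as most recently used
        return value

    def __setitem__(self, key, value):
        self._cache[key] = value
        self._cache.move_to_end(key)
        if len(self._cache) > self._maxsize:
            self._cache.popitem(last=False)  # evict least recently used
```

On a miss, the caller (in xarray, the file manager) opens the file again and stores it back in the cache, which is why the netCDF4 open is re-attempted here.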


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · About: xarray-datasette