issue_comments

1 row where author_association = "CONTRIBUTOR", issue = 1056881922 and user = 8291800 sorted by updated_at descending

id: 1011734052
html_url: https://github.com/pydata/xarray/issues/6000#issuecomment-1011734052
issue_url: https://api.github.com/repos/pydata/xarray/issues/6000
node_id: IC_kwDOAMm_X848TdYk
user: scottstanie (8291800)
created_at: 2022-01-13T03:08:02Z
updated_at: 2022-01-13T03:08:02Z
author_association: CONTRIBUTOR
body:

Sorry, I should have included that version of the error as well. Both give errors.
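For reference, a minimal reproduction along the lines of the `testxr.py` referenced in these tracebacks might look like the sketch below. Only the `to_netcdf("testdata.nc", engine="h5netcdf")` call on line 35 of `testxr.py` appears in the tracebacks, so the setup, variable names, and chunking here are assumptions.

```python
# Hypothetical reconstruction of the failing pattern in testxr.py; only the
# final to_netcdf("testdata.nc", engine="h5netcdf") call is taken from the
# traceback below, the rest (setup, names, chunking) is assumed.
import numpy as np
import xarray as xr

# Assumed setup: create the file that will later be read from and overwritten.
xr.DataArray(
    np.random.rand(10, 10), dims=("y", "x"), name="testdata"
).to_netcdf("testdata.nc", engine="h5netcdf")

# Chunked (dask-backed) access, matching the "parallel access" in the issue title.
with xr.open_dataarray("testdata.nc", engine="h5netcdf", chunks={"x": 5}) as da:
    da_new = da * 2
    # The source file is still held open by the `with` block, so re-opening the
    # same path for writing trips the HDF5 file lock (errno = 11).
    da_new.to_dataset(name="new_testdata").to_netcdf("testdata.nc", engine="h5netcdf")
```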

Here's when I use `engine="h5netcdf"`:

```
Traceback (most recent call last):
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'h5netcdf.core.File'>, ('/home/scott/testdata.nc',), 'a', (('invalid_netcdf', None),)]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "testxr.py", line 35, in <module>
    da_new.to_dataset(name="new_testdata").to_netcdf("testdata.nc", engine="h5netcdf")
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/core/dataset.py", line 1900, in to_netcdf
    return to_netcdf(
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/api.py", line 1060, in to_netcdf
    store = store_open(target, mode, format, group, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 178, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 123, in __init__
    self.filename = find_root_and_group(self.ds)[0].filename
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 189, in ds
    return self._acquire()
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 181, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5netcdf/core.py", line 712, in __init__
    self._h5file = h5py.File(path, mode, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5py/_hl/files.py", line 406, in __init__
    fid = make_fid(name, mode, userblock_size,
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5py/_hl/files.py", line 179, in make_fid
    fid = h5f.create(name, h5f.ACC_TRUNC, fapl=fapl, fcpl=fcpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 108, in h5py.h5f.create
OSError: Unable to create file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')
```

And here's the original:

```
$ python testxr.py
Traceback (most recent call last):
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 199, in _acquire_with_cache_info
    file = self._cache[self._key]
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/lru_cache.py", line 53, in __getitem__
    value = self._cache[key]
KeyError: [<class 'h5netcdf.core.File'>, ('/home/scott/testdata.nc',), 'a', (('invalid_netcdf', None),)]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "testxr.py", line 35, in <module>
    da_new.to_dataset(name="new_testdata").to_netcdf("testdata.nc", engine="h5netcdf")
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/core/dataset.py", line 1900, in to_netcdf
    return to_netcdf(
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/api.py", line 1060, in to_netcdf
    store = store_open(target, mode, format, group, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 178, in open
    return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 123, in __init__
    self.filename = find_root_and_group(self.ds)[0].filename
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 189, in ds
    return self._acquire()
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/h5netcdf_.py", line 181, in _acquire
    with self._manager.acquire_context(needs_lock) as root:
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 187, in acquire_context
    file, cached = self._acquire_with_cache_info(needs_lock)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/file_manager.py", line 205, in _acquire_with_cache_info
    file = self._opener(*self._args, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5netcdf/core.py", line 712, in __init__
    self._h5file = h5py.File(path, mode, **kwargs)
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5py/_hl/files.py", line 406, in __init__
    fid = make_fid(name, mode, userblock_size,
  File "/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/h5py/_hl/files.py", line 179, in make_fid
    fid = h5f.create(name, h5f.ACC_TRUNC, fapl=fapl, fcpl=fcpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 108, in h5py.h5f.create
OSError: Unable to create file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')
```
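For anyone hitting the same lock: both tracebacks fail while re-opening `testdata.nc` for writing while the `with` block still holds it open for reading, which HDF5's file locking refuses. A possible workaround, sketched with the same assumed names as above, is to load the result into memory and write only once the source file is closed, or write to a different path:

```python
# Hedged workaround sketch (file and variable names are assumptions, not from the report).
import xarray as xr

with xr.open_dataarray("testdata.nc", engine="h5netcdf") as da:
    # Compute and pull the result fully into memory while the source is still open.
    da_new = (da * 2).load()

# Write only after the `with` block has closed the source file, or write to a
# different path so the still-open file never has to be re-locked for writing.
da_new.to_dataset(name="new_testdata").to_netcdf("testdata_out.nc", engine="h5netcdf")
```

Setting the HDF5 environment variable `HDF5_USE_FILE_LOCKING=FALSE` can also silence the lock check, but overwriting a file that is still open for reading is generally unsafe.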

Output of `xr.show_versions()`:

```
>>> xarray.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.8.5 (default, Sep 4 2020, 07:30:14) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1127.19.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.10.6
libnetcdf: 4.8.0

xarray: 0.19.0
pandas: 1.1.2
numpy: 1.21.2
scipy: 1.6.1
netCDF4: 1.5.7
pydap: None
h5netcdf: 0.11.0
h5py: 2.10.0
Nio: None
zarr: 2.8.3
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.2.8
cfgrib: 0.9.8.5
iris: None
bottleneck: 1.3.2
dask: 2.12.0
distributed: 2.25.0
matplotlib: 3.3.1
cartopy: 0.20.0
seaborn: None
numbagg: None
pint: 0.17
setuptools: 49.6.0.post20200814
pip: 20.0.2
conda: 4.8.4
pytest: None
IPython: 7.18.1
sphinx: 4.0.2
```
reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Parallel access to DataArray within `with` statement causes `BlockingIOError` (1056881922)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);