issue_comments

6 rows where issue = 1206496679 sorted by updated_at descending


id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
1101519903 https://github.com/pydata/xarray/issues/6490#issuecomment-1101519903 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85Bp9wf max-sixty 5635139 2022-04-18T15:57:10Z 2022-04-18T15:57:10Z MEMBER

OK, cheers @javedali99!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  NotImplementedError: Don't yet support nd fancy indexing 1206496679
1101518164 https://github.com/pydata/xarray/issues/6490#issuecomment-1101518164 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85Bp9VU javedali99 15319503 2022-04-18T15:54:30Z 2022-04-18T15:54:30Z NONE

Thanks @dcherian @max-sixty.

I solved the issue by re-installing xarray and cfgrib individually and restarting the kernel.

1101504397 https://github.com/pydata/xarray/issues/6490#issuecomment-1101504397 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85Bp5-N dcherian 2448579 2022-04-18T15:35:23Z 2022-04-18T15:35:23Z MEMBER

```python
# subsetting the data based on boundary coordinates
ds_sel = ds2011_2014.isel(
    lon=(ds2011_2014.lon >= left) & (ds2011_2014.lon <= right),
    lat=(ds2011_2014.lat >= bottom) & (ds2011_2014.lat <= top),
)
```

Please use the sel method instead of a boolean mask. This will work for this dataset because lat and lon are dimension coordinates.

See https://docs.xarray.dev/en/stable/user-guide/indexing.html
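A minimal sketch of the suggested `sel` approach, using a made-up toy dataset in place of `ds2011_2014` (the coordinate values here are illustrative, not from the NOAA files):

```python
import numpy as np
import xarray as xr

# Made-up stand-in for ds2011_2014; lat and lon are dimension coordinates,
# so label-based selection with .sel works directly.
ds = xr.Dataset(
    {"precip": (("lat", "lon"), np.arange(20.0).reshape(4, 5))},
    coords={
        "lat": [36.0, 37.0, 38.0, 40.0],
        "lon": [257.0, 258.0, 260.0, 264.0, 266.0],
    },
)

top, bottom, left, right = 40, 37, 258, 265.4

# .sel with slice is label-based and inclusive on both ends
# for monotonic coordinates
ds_sel = ds.sel(lat=slice(bottom, top), lon=slice(left, right))
```

Unlike the boolean-mask `isel` call, this never builds an n-dimensional fancy index, which is why it avoids the `NotImplementedError`.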

1101071306 https://github.com/pydata/xarray/issues/6490#issuecomment-1101071306 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85BoQPK max-sixty 5635139 2022-04-18T04:08:39Z 2022-04-18T04:08:39Z MEMBER

I'm getting a different error around the encoding — please could the MVCE not use external data? Check out the link on the label, or the issue template, for more tips. Thanks

```python
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/file_manager.py:199, in CachingFileManager._acquire_with_cache_info(self, needs_lock)
    198 try:
--> 199     file = self._cache[self._key]
    200 except KeyError:

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/lru_cache.py:53, in LRUCache.__getitem__(self, key)
     52 with self._lock:
---> 53     value = self._cache[key]
     54     self._cache.move_to_end(key)

KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('/home/jovyan/precip.V1.0.2014.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
Input In [5], in <module>
      1 # combine netcdf files
----> 2 ds2011_2014 = xr.open_mfdataset('precip.V1.0.*.nc', concat_dim='time', combine='nested', engine='netcdf4')

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/api.py:908, in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, data_vars, coords, combine, parallel, join, attrs_file, combine_attrs, **kwargs)
    905 open_ = open_dataset
    906 getattr_ = getattr
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
    909 closers = [getattr_(ds, "_close") for ds in datasets]
    910 if preprocess is not None:

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/api.py:908, in <listcomp>(.0)
    905 open_ = open_dataset
    906 getattr_ = getattr
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
    909 closers = [getattr_(ds, "_close") for ds in datasets]
    910 if preprocess is not None:

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/api.py:495, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, backend_kwargs, *args, **kwargs)
    483 decoders = _resolve_decoders_kwargs(
    484     decode_cf,
    485     open_backend_dataset_parameters=backend.open_dataset_parameters,
   (...)
    491     decode_coords=decode_coords,
    492 )
    494 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 495 backend_ds = backend.open_dataset(
    496     filename_or_obj,
    497     drop_variables=drop_variables,
    498     **decoders,
    499     **kwargs,
    500 )
    501 ds = _dataset_from_backend_dataset(
    502     backend_ds,
    503     filename_or_obj,
   (...)
    510     **kwargs,
    511 )
    512 return ds

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:553, in NetCDF4BackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, group, mode, format, clobber, diskless, persist, lock, autoclose)
    532 def open_dataset(
    533     self,
    534     filename_or_obj,
   (...)
    549     autoclose=False,
    550 ):
    552     filename_or_obj = _normalize_path(filename_or_obj)
--> 553     store = NetCDF4DataStore.open(
    554         filename_or_obj,
    555         mode=mode,
    556         format=format,
    557         group=group,
    558         clobber=clobber,
    559         diskless=diskless,
    560         persist=persist,
    561         lock=lock,
    562         autoclose=autoclose,
    563     )
    565     store_entrypoint = StoreBackendEntrypoint()
    566     with close_on_error(store):

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:382, in NetCDF4DataStore.open(cls, filename, mode, format, group, clobber, diskless, persist, lock, lock_maker, autoclose)
    376 kwargs = dict(
    377     clobber=clobber, diskless=diskless, persist=persist, format=format
    378 )
    379 manager = CachingFileManager(
    380     netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
    381 )
--> 382 return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:330, in NetCDF4DataStore.__init__(self, manager, group, mode, lock, autoclose)
    328 self._group = group
    329 self._mode = mode
--> 330 self.format = self.ds.data_model
    331 self._filename = self.ds.filepath()
    332 self.is_remote = is_remote_uri(self._filename)

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:391, in NetCDF4DataStore.ds(self)
    389 @property
    390 def ds(self):
--> 391     return self._acquire()

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/netCDF4_.py:385, in NetCDF4DataStore._acquire(self, needs_lock)
    384 def _acquire(self, needs_lock=True):
--> 385     with self._manager.acquire_context(needs_lock) as root:
    386         ds = _nc4_require_group(root, self._group, self._mode)
    387     return ds

File /srv/conda/envs/notebook/lib/python3.8/contextlib.py:113, in _GeneratorContextManager.__enter__(self)
    111 del self.args, self.kwds, self.func
    112 try:
--> 113     return next(self.gen)
    114 except StopIteration:
    115     raise RuntimeError("generator didn't yield") from None

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/file_manager.py:187, in CachingFileManager.acquire_context(self, needs_lock)
    184 @contextlib.contextmanager
    185 def acquire_context(self, needs_lock=True):
    186     """Context manager for acquiring a file."""
--> 187     file, cached = self._acquire_with_cache_info(needs_lock)
    188     try:
    189         yield file

File /srv/conda/envs/notebook/lib/python3.8/site-packages/xarray/backends/file_manager.py:205, in CachingFileManager._acquire_with_cache_info(self, needs_lock)
    203 kwargs = kwargs.copy()
    204 kwargs["mode"] = self._mode
--> 205 file = self._opener(*self._args, **kwargs)
    206 if self._mode == "w":
    207     # ensure file doesn't get overridden when opened again
    208     self._mode = "a"

File src/netCDF4/_netCDF4.pyx:2307, in netCDF4._netCDF4.Dataset.__init__()

File src/netCDF4/_netCDF4.pyx:1925, in netCDF4._netCDF4._ensure_nc_success()

OSError: [Errno -101] NetCDF: HDF error: b'/home/jovyan/precip.V1.0.2014.nc'
```
1101061409 https://github.com/pydata/xarray/issues/6490#issuecomment-1101061409 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85BoN0h javedali99 15319503 2022-04-18T03:38:43Z 2022-04-18T03:38:43Z NONE

@javedali99 please could you supply an MVCE?

```python
# download data
for yr in range(2011, 2015):
    url = f'https://downloads.psl.noaa.gov/Datasets/cpc_us_precip/RT/precip.V1.0.{yr}.nc'
    savename = url.split('/')[-1]
    urllib.request.urlretrieve(url, savename)

# combine netcdf files
ds2011_2014 = xr.open_mfdataset('precip.V1.0.*.nc', concat_dim='time', combine='nested')

# coordinates
top = 40
bottom = 37
left = 258
right = 265.4

# subsetting the data based on boundary coordinates
ds_sel = ds2011_2014.isel(
    lon=(ds2011_2014.lon >= left) & (ds2011_2014.lon <= right),
    lat=(ds2011_2014.lat >= bottom) & (ds2011_2014.lat <= top),
)
```
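If a boolean-mask style is preferred over label-based selection, `Dataset.where(..., drop=True)` sidesteps the n-d fancy-indexing path entirely. A minimal sketch, using a made-up toy dataset in place of `ds2011_2014` (coordinate values are illustrative, not from the NOAA files):

```python
import numpy as np
import xarray as xr

# Made-up stand-in for ds2011_2014
ds = xr.Dataset(
    {"precip": (("lat", "lon"), np.arange(20.0).reshape(4, 5))},
    coords={
        "lat": [36.0, 37.0, 38.0, 40.0],
        "lon": [257.0, 258.0, 260.0, 264.0, 266.0],
    },
)

top, bottom, left, right = 40, 37, 258, 265.4

# where() broadcasts the 1-D masks to a 2-D mask, replaces values
# outside the box with NaN, and drop=True trims labels that are
# entirely masked out along each dimension
ds_sel = ds.where(
    (ds.lon >= left) & (ds.lon <= right)
    & (ds.lat >= bottom) & (ds.lat <= top),
    drop=True,
)
```

Because the bounding-box mask is separable per dimension, `drop=True` leaves a dense sub-box with no NaNs, equivalent to the `sel`-with-`slice` result.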

1100958666 https://github.com/pydata/xarray/issues/6490#issuecomment-1100958666 https://api.github.com/repos/pydata/xarray/issues/6490 IC_kwDOAMm_X85Bn0vK max-sixty 5635139 2022-04-17T22:19:43Z 2022-04-17T22:19:43Z MEMBER

@javedali99 please could you supply an MVCE?



CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
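The schema above can be exercised directly with Python's built-in sqlite3 module. The sketch below recreates the table in memory (the `REFERENCES` clauses are omitted since the `users` and `issues` tables are not created here) and runs a query shaped like this page's "where issue = …" filter; the inserted row is illustrative, not real data:

```python
import sqlite3

SCHEMA = """
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)

# illustrative row, not taken from the table
conn.execute(
    "INSERT INTO issue_comments (id, user, issue, body) VALUES (?, ?, ?, ?)",
    (1, 5635139, 1206496679, "example body"),
)

# the idx_issue_comments_issue index serves this filter
rows = conn.execute(
    "SELECT id, body FROM issue_comments WHERE issue = ? ORDER BY updated_at DESC",
    (1206496679,),
).fetchall()
```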
Powered by Datasette · Queries took 14.018ms · About: xarray-datasette