issue_comments

5 rows where issue = 1520760951 and user = 8382834, sorted by updated_at descending

id, html_url, issue_url, node_id, user, created_at, updated_at, author_association, body, reactions, performed_via_github_app, issue
1373697191 https://github.com/pydata/xarray/issues/7421#issuecomment-1373697191 https://api.github.com/repos/pydata/xarray/issues/7421 IC_kwDOAMm_X85R4PSn jerabaul29 8382834 2023-01-06T14:13:52Z 2023-01-06T14:13:52Z CONTRIBUTOR

Creating a conda environment as you suggest, I am fully able to read the file etc., so this solves my issue. Many thanks! I guess this means there is some weird issue leading to segfaults on this file with some of the older libnetcdf versions. Closing, as using a conda env and a more recent stack fixes things.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Opening a variable as several chunks works fine, but opening it "fully" crashes 1520760951
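
Since the fix reported in the comment above was a fresh conda environment with a more recent netCDF stack, here is a minimal sketch (not from the issue itself) of how one might confirm which netCDF4 bindings and underlying libnetcdf build are actually in use:

```python
import netCDF4
import xarray as xr

# Versions of the Python bindings and of the underlying C library;
# an old libnetcdf here would be consistent with the segfault reported above.
print("netCDF4 (Python bindings):", netCDF4.__version__)
print("libnetcdf (C library):", netCDF4.getlibversion())

# xarray's own diagnostic dump of the whole stack (xarray, netCDF4, HDF5, ...).
xr.show_versions()
```
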
1373600032 https://github.com/pydata/xarray/issues/7421#issuecomment-1373600032 https://api.github.com/repos/pydata/xarray/issues/7421 IC_kwDOAMm_X85R33kg jerabaul29 8382834 2023-01-06T13:09:29Z 2023-01-06T13:09:29Z CONTRIBUTOR

OK, thanks, this does crash on my machine too. It is likely something to do with my software stack somewhere; I will try a new mamba / conda environment and check whether this fixes things.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Opening a variable as several chunks works fine, but opening it "fully" crashes 1520760951
1373587759 https://github.com/pydata/xarray/issues/7421#issuecomment-1373587759 https://api.github.com/repos/pydata/xarray/issues/7421 IC_kwDOAMm_X85R30kv jerabaul29 8382834 2023-01-06T12:57:38Z 2023-01-06T12:57:38Z CONTRIBUTOR

@keewis regarding the engine: I have netcdf4 installed and I do not provide a dedicated engine in the open_dataset command, so I guess this is using the netcdf4 engine by default? Is there a command I can run to double-check and confirm this for you? :)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Opening a variable as several chunks works fine, but opening it "fully" crashes 1520760951
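
For the engine question in the comment above, a small sketch of how one could list the backends xarray detects and force the netcdf4 engine explicitly; `input_file` is a placeholder, not the actual file from the issue:

```python
import xarray as xr

input_file = "issue_file.nc"  # placeholder path standing in for the file from the issue

# List the backends xarray has detected (netcdf4, scipy, h5netcdf, ...);
# available in reasonably recent xarray versions.
print(xr.backends.list_engines())

# Passing the engine explicitly removes any doubt about which backend is used.
ds = xr.open_dataset(input_file, engine="netcdf4", decode_times=False)
```
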
1373584045 https://github.com/pydata/xarray/issues/7421#issuecomment-1373584045 https://api.github.com/repos/pydata/xarray/issues/7421 IC_kwDOAMm_X85R3zqt jerabaul29 8382834 2023-01-06T12:53:18Z 2023-01-06T12:53:18Z CONTRIBUTOR

@keewis interesting. Just to be sure: I am able to open the dataset just fine too; the issue arises when actually trying to read the field, i.e.:

```python
xr_file = xr.open_dataset(input_file, decode_times=False)
```

is just fine, but

python xr_file["accD"][0, 0:3235893].data

is what segfaults; just to be sure there is no misunderstanding, are you actually able to run the last command without issue? :)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Opening a variable as several chunks works fine, but opening it "fully" crashes 1520760951
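
The behaviour described in the comment above (and in the issue title) can be sketched as follows; the path is a placeholder, the variable name and upper index follow the snippet in the comment, and the chunk size is arbitrary:

```python
import numpy as np
import xarray as xr

input_file = "issue_file.nc"  # placeholder for the file attached to the issue
n = 3235893                   # upper index used in the comment above
chunk = 500_000               # arbitrary chunk size for the piecewise read

xr_file = xr.open_dataset(input_file, decode_times=False)

# Reading the variable in several chunks was reported to work...
parts = [xr_file["accD"][0, i:i + chunk].data for i in range(0, n, chunk)]
data_chunked = np.concatenate(parts)

# ...while a single full read is what segfaulted on the affected stack.
data_full = xr_file["accD"][0, 0:n].data
```
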
1373580808 https://github.com/pydata/xarray/issues/7421#issuecomment-1373580808 https://api.github.com/repos/pydata/xarray/issues/7421 IC_kwDOAMm_X85R3y4I jerabaul29 8382834 2023-01-06T12:49:05Z 2023-01-06T12:49:05Z CONTRIBUTOR

I got help extracting more information in gdb; converting the ipynb to a .py file and running it under gdb:

```
jupyter nbconvert --to script issue_opening_2018_03_b.ipynb
[NbConvertApp] Converting notebook issue_opening_2018_03_b.ipynb to script
[NbConvertApp] Writing 1313 bytes to issue_opening_2018_03_b.py

gdb --args python3 issue_opening_2018_03_b.py
[...]
(gdb) run
Starting program: /usr/bin/python3 issue_opening_2018_03_b.py
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
[...]
Thread 1 "python3" received signal SIGSEGV, Segmentation fault.
__memmove_sse2_unaligned_erms () at ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S:314
314     ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S: No such file or directory.
(gdb) bt
#0  __memmove_sse2_unaligned_erms () at ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S:314
#1  0x00007ffff6af4bdc in NC4_get_vars () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#2  0x00007ffff6af337d in NC4_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#3  0x00007ffff6a959aa in NC_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#4  0x00007ffff6a96b9b in nc_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#5  0x00007ffff6ec24bc in ?? () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/_netCDF4.cpython-38-x86_64-linux-gnu.so
#6  0x00000000005f5b39 in PyCFunction_Call ()
```

which seems to originate in libnetcdf?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Opening a variable as several chunks works fine, but opening it "fully" crashes 1520760951
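
The gdb session above is one way to capture the crash; as an alternative that is not used in the issue, Python's standard-library faulthandler can print a Python-level traceback when the process receives SIGSEGV (the path below is again a placeholder):

```python
import faulthandler

import xarray as xr

faulthandler.enable()  # dump a Python traceback if the process segfaults

xr_file = xr.open_dataset("issue_file.nc", decode_times=False)  # placeholder path
data = xr_file["accD"][0, 0:3235893].data  # the read that triggered the crash
```
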

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
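
Given the schema above, a sketch of reproducing this page's query against a local copy of the database; the file name `github.db` is an assumption:

```python
import sqlite3

conn = sqlite3.connect("github.db")  # assumed local copy of the Datasette database

rows = conn.execute(
    """
    SELECT id, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = 1520760951
      AND [user] = 8382834
    ORDER BY updated_at DESC
    """
).fetchall()

for comment_id, created, updated, association, body in rows:
    print(comment_id, updated, association, body[:60])
```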