issue_comments
8 rows where issue = 970619131 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
1545346823 | https://github.com/pydata/xarray/issues/5706#issuecomment-1545346823 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X85cHB8H | kmuehlbauer 5821660 | 2023-05-12T08:06:06Z | 2023-05-12T08:06:06Z | MEMBER | This is resolved in recent netcdf-c versions. |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
1170812062 | https://github.com/pydata/xarray/issues/5706#issuecomment-1170812062 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X85FySye | kmuehlbauer 5821660 | 2022-06-30T06:17:49Z | 2022-06-30T06:17:49Z | MEMBER | Problem source identified in netcdf-c: https://github.com/Unidata/netcdf-c/issues/2159 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
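Since the root cause sits in netcdf-c rather than in xarray itself, a quick way to see which libnetcdf a given environment actually links against is sketched below; the exact netcdf-c release containing the fix is not stated in this thread, so the check only reports the version.
```python
# Report the netcdf-c (libnetcdf) and HDF5 versions the installed netCDF4
# module links against; compare against the fix referenced in Unidata/netcdf-c#2159.
import netCDF4

print(netCDF4.__netcdf4libversion__)  # e.g. "4.7.4" in the traceback environment further down
print(netCDF4.__hdf5libversion__)     # linked HDF5 version
```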
1012204673 | https://github.com/pydata/xarray/issues/5706#issuecomment-1012204673 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848VQSB | scottstanie 8291800 | 2022-01-13T14:48:08Z | 2022-01-13T14:48:08Z | CONTRIBUTOR | Sounds good, but it seems like you're correct that it's a netcdf/netcdf4-python problem here, so I'll defer to others as to what the best changes to default settings would be to avoid the segfaults |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
1012189867 | https://github.com/pydata/xarray/issues/5706#issuecomment-1012189867 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848VMqr | kmuehlbauer 5821660 | 2022-01-13T14:31:31Z | 2022-01-13T14:31:31Z | MEMBER | @scottstanie I'll check my h5py/hdf5 settings, but I doubt that is the difference. I've seen the trailing garbage change from run to run, sometimes disappearing. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
1012132794 | https://github.com/pydata/xarray/issues/5706#issuecomment-1012132794 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848U-u6 | scottstanie 8291800 | 2022-01-13T13:23:45Z | 2022-01-13T13:23:45Z | CONTRIBUTOR | Ah sorry, didn't see the request for the `ncdump` output:
```
$ ncdump test_str_list.h5
netcdf test_str_list {
dimensions:
phony_dim_0 = 2 ;
phony_dim_1 = 2 ;
variables:
string pairs(phony_dim_0, phony_dim_1) ;
data:

 pairs =
  "2020010120200201 ", NIL,
  "2020010120200301 ", NIL ;
}
```
```
netcdf test_str_list_attr {
// global attributes:
string :NULLPAD = "20200101�<T��\007", "20200201", "20200101�=T��\007", "20200301" ;
string :NULLTERM = "20200101", "20200201", "20200101", "20200301" ;
string :numpy_S = "20200101", "20200201\1775T��\007", "20200101", "20200301�3T��\007" ;
string :numpy_O = "20200101", "20200201", "20200101", "20200301" ;
}
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
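A note on the trailing garbage above: here is a minimal sketch (assuming `test_str_list.h5` was produced from a numpy `"S"` array as in the reproducer later in this thread) of how h5py's low-level API can confirm that the strings are stored as fixed-width, null-padded 8-byte slots with no terminating NUL, which would explain why a reader expecting NUL-terminated strings picks up stray heap bytes.
```python
# Inspect how the "pairs" strings are stored on disk; assumes the
# test_str_list.h5 reproducer from this thread (a 2x2 numpy "S8" array).
import h5py

with h5py.File("test_str_list.h5", "r") as hf:
    pairs = hf["pairs"]
    tid = pairs.id.get_type()                        # low-level HDF5 string type
    print(pairs.dtype)                               # |S8: fixed-width bytes
    print(tid.get_size())                            # 8: every slot is exactly filled
    print(tid.get_strpad() == h5py.h5t.STR_NULLPAD)  # True: padded, not NUL-terminated
```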
1012003403 | https://github.com/pydata/xarray/issues/5706#issuecomment-1012003403 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848UfJL | kmuehlbauer 5821660 | 2022-01-13T10:31:25Z | 2022-01-13T10:31:25Z | MEMBER | @scottstanie Here is the output of ncdump:
```
netcdf test_str_list {
dimensions:
phony_dim_0 = 2 ;
phony_dim_1 = 2 ;
variables:
string pairs(phony_dim_0, phony_dim_1) ;
data:

 pairs =
  "2020010120200201�\f\033��U", NIL,
  "2020010120200301 ", NIL ;
}
```
You see the trailing garbage. This is obviously a problem with netcdf-c/netcdf4-python, as it is not there with pure hdf5 (h5py/h5netcdf). But there is a difference between Attributes and Datasets:
```python
# (run in IPython/Jupyter: display() and the "!" shell commands are notebook features)
import h5py
import netCDF4 as nc
import xarray as xr

with h5py.File("test_str_list_attr.h5", "w") as hf:
    # fixed-width (8 byte), null-padded string type via the h5py low-level API
    sid = h5py.h5s.create_simple((2, 2), (2, 2))
    tid1 = h5py.h5t.TypeID.copy(h5py.h5t.C_S1)
    tid1.set_size(8)
    tid1.set_strpad(h5py.h5t.STR_NULLPAD)
    # ... (attribute creation for NULLPAD/NULLTERM/numpy_S/numpy_O not shown here)

!h5dump test_str_list_attr.h5
!ncdump test_str_list_attr.h5

with xr.load_dataset("test_str_list_attr.h5", engine="h5netcdf", phony_dims="sort") as ds:
    display(ds)
with xr.load_dataset("test_str_list_attr.h5", engine="netcdf4") as ds:
    display(ds)
with nc.Dataset("test_str_list_attr.h5") as ds:
    display(ds)
    display(ds.NULLTERM)
    display(ds.NULLPAD)
    display(ds.numpy_O)
    display(ds.numpy_S)
```
```
HDF5 "test_str_list_attr.h5" {
GROUP "/" {
ATTRIBUTE "NULLPAD" {
DATATYPE H5T_STRING {
STRSIZE 8;
STRPAD H5T_STR_NULLPAD;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
ATTRIBUTE "NULLTERM" {
DATATYPE H5T_STRING {
STRSIZE 9;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
ATTRIBUTE "numpy_O" {
DATATYPE H5T_STRING {
STRSIZE H5T_VARIABLE;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
ATTRIBUTE "numpy_S" {
DATATYPE H5T_STRING {
STRSIZE 8;
STRPAD H5T_STR_NULLPAD;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
}
}
netcdf test_str_list_attr {
// global attributes:
string :NULLPAD = "20200101", "20200201", "20200101", "20200301" ;
string :NULLTERM = "20200101", "20200201", "20200101", "20200301" ;
string :numpy_S = "20200101", "20200201@�s}�U", "20200101", "20200301�6t}�U" ;
string :numpy_O = "20200101", "20200201", "20200101", "20200301" ;
}
<xarray.Dataset>
Dimensions: ()
Data variables:
*empty*
Attributes:
NULLPAD: [[b'20200101' b'20200201']\n [b'20200101' b'20200301']]
NULLTERM: [[b'20200101' b'20200201']\n [b'20200101' b'20200301']]
numpy_O: [['20200101' '20200201']\n ['20200101' '20200301']]
numpy_S: [[b'20200101' b'20200201']\n [b'20200101' b'20200301']]
<xarray.Dataset>
Dimensions: ()
Data variables:
*empty*
Attributes:
NULLPAD: ['20200101', '20200201', '20200101', '20200301']
NULLTERM: ['20200101', '20200201', '20200101', '20200301']
numpy_S: ['20200101', '20200201', '20200101p��i�U', '20200301']
numpy_O: ['20200101', '20200201', '20200101', '20200301']
<class 'netCDF4._netCDF4.Dataset'>
root group (NETCDF4 data model, file format HDF5):
NULLPAD: ['20200101', '20200201', '20200101', '20200301']
NULLTERM: ['20200101', '20200201', '20200101', '20200301']
numpy_S: ['20200101', '20200201', '20200101', '20200301']
numpy_O: ['20200101', '20200201', '20200101', '20200301']
dimensions(sizes):
variables(dimensions):
groups:
['20200101', '20200201', '20200101', '20200301']
['20200101', '20200201', '20200101', '20200301']
['20200101', '20200201', '20200101', '20200301']
['20200101', '20200201', '20200101', '20200301']
```
It's clearly seen that the Datasets are correct in the hdf5 dump, but somehow netcdf-c has issues with the NULLPAD/NULLTERM strings. But at least there is no segfault with attributes, other than with Datasets/Variables:
```python
import h5py
import netCDF4 as nc
import numpy as np
import xarray as xr

with h5py.File("test_str_list_ds.h5", "w") as hf:
    blob = np.array([["20200101", "20200201"], ["20200101", "20200301"]]).astype("S")
    # ... (dataset creation for NULLPAD/NULLTERM/numpy_O/numpy_S not shown here)

!h5dump test_str_list_ds.h5
!ncdump test_str_list_ds.h5

with xr.load_dataset("test_str_list_ds.h5", engine="h5netcdf", phony_dims="sort") as ds:
    display(ds)
with xr.load_dataset("test_str_list_ds.h5", engine="netcdf4") as ds:
    display(ds["numpy_O"])
with nc.Dataset("test_str_list_ds.h5") as ds:
    display(ds)
    # display("NULLTERM:", ds["NULLTERM"][:])
    # display("NULLPAD:", ds["NULLPAD"][:])
    display("numpy_O", ds["numpy_O"][:])
    # display("numpy_S", ds["numpy_S"][:])
```
```
HDF5 "test_str_list_ds.h5" {
GROUP "/" {
DATASET "NULLPAD" {
DATATYPE H5T_STRING {
STRSIZE 8;
STRPAD H5T_STR_NULLPAD;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
DATASET "NULLTERM" {
DATATYPE H5T_STRING {
STRSIZE 9;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
DATASET "numpy_O" {
DATATYPE H5T_STRING {
STRSIZE H5T_VARIABLE;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
DATASET "numpy_S" {
DATATYPE H5T_STRING {
STRSIZE 8;
STRPAD H5T_STR_NULLPAD;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}
DATASPACE SIMPLE { ( 2, 2 ) / ( 2, 2 ) }
DATA {
(0,0): "20200101", "20200201",
(1,0): "20200101", "20200301"
}
}
}
}
netcdf test_str_list_ds {
dimensions:
phony_dim_0 = 2 ;
phony_dim_1 = 2 ;
variables:
string NULLPAD(phony_dim_0, phony_dim_1) ;
string NULLTERM(phony_dim_0, phony_dim_1) ;
string numpy_O(phony_dim_0, phony_dim_1) ;
string numpy_S(phony_dim_0, phony_dim_1) ;
data:
NULLPAD =
"2020010120200201�4k�U", NIL,
"2020010120200301 ", NIL ;
NULLTERM =
"20200101", NIL,
"20200101", NIL ;
numpy_O =
"20200101", "20200201",
"20200101", "20200301" ;
numpy_S =
"2020010120200201", NIL,
"2020010120200301 ", NIL ;
}
<xarray.Dataset>
Dimensions: (phony_dim_0: 2, phony_dim_1: 2)
Dimensions without coordinates: phony_dim_0, phony_dim_1
Data variables:
NULLPAD (phony_dim_0, phony_dim_1) |S8 b'20200101' ... b'20200301'
NULLTERM (phony_dim_0, phony_dim_1) |S9 b'20200101' ... b'20200301'
numpy_O (phony_dim_0, phony_dim_1) object '20200101' ... '20200301'
numpy_S (phony_dim_0, phony_dim_1) |S8 b'20200101' ... b'20200301'
```
So here, netcdf-c/netcdf4-python will segfault for all variables besides `numpy_O`. It looks like the only option to achieve this for datasets/variables is to use numpy opaque dtype. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
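The dataset test above shows that only the variable-length `numpy_O` strings come back cleanly through netcdf-c. Here is a minimal write-side sketch of that route, assuming one controls how the file is written; the file name `test_str_list_vlen.h5` is made up for illustration.
```python
# Store the strings as variable-length HDF5 strings (numpy object dtype)
# instead of fixed-width "S" strings, matching the numpy_O case above.
import h5py
import numpy as np

pairs = np.array([["20200101", "20200201"], ["20200101", "20200301"]], dtype=object)

with h5py.File("test_str_list_vlen.h5", "w") as hf:
    hf.create_dataset("pairs", data=pairs, dtype=h5py.string_dtype())

# Reading this back through the default netcdf4 engine should then avoid the
# trailing-garbage / nc_free_string path entirely:
# import xarray as xr
# ds = xr.load_dataset("test_str_list_vlen.h5")
```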
1011556328 | https://github.com/pydata/xarray/issues/5706#issuecomment-1011556328 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848Sx_o | scottstanie 8291800 | 2022-01-12T23:51:07Z | 2022-01-12T23:53:01Z | CONTRIBUTOR | sure! here it is:
(and just to include the specific traceback that happened now, in case my versions are different from what I showed):
```python
In [4]: import h5py
...: import xarray as xr
...:
...: with h5py.File("test_str_list.h5", "w") as hf:
...: hf["pairs"] = np.array([["20200101", "20200201"], ["20200101", "20200301"]]).astype("S")
...:
...: ds = xr.load_dataset("test_str_list.h5")
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/xarray/backends/plugins.py:68: RuntimeWarning: Engine 'cfgrib' loading failed:
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/gribapi/_bindings.cpython-38-x86_64-linux-gnu.so: undefined symbol: codes_bufr_key_is_header
warnings.warn(f"Engine {name!r} loading failed:\n{ex}", RuntimeWarning)
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/fsspec/implementations/local.py:29: FutureWarning: The default value of auto_mkdir=True has been deprecated and will be changed to auto_mkdir=False by default in a future release.
warnings.warn(
*** Error in `/home/scott/miniconda3/envs/mapping/bin/python': free(): invalid next size (fast): 0x00005564b64622a0 ***
======= Backtrace: =========
/lib64/libc.so.6(+0x81679)[0x7f56e752b679]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/netCDF4/../../../libnetcdf.so.18(nc_free_string+0x25)[0x7f54cf53d1a5]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/netCDF4/_netCDF4.cpython-38-x86_64-linux-gnu.so(+0xcf3c8)[0x7f54cf7313c8]
/home/scott/miniconda3/envs/mapping/bin/python(PyCFunction_Call+0x54)[0x5564b397df44]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/netCDF4/_netCDF4.cpython-38-x86_64-linux-gnu.so(+0x224fd)[0x7f54cf6844fd]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/netCDF4/_netCDF4.cpython-38-x86_64-linux-gnu.so(+0x559d9)[0x7f54cf6b79d9]
/home/scott/miniconda3/envs/mapping/bin/python(PyObject_GetItem+0x45)[0x5564b39d7935]
/home/scott/miniconda3/envs/mapping/bin/python(+0x128e0b)[0x5564b397ae0b]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0x947)[0x5564b3a1ec77]
/home/scott/miniconda3/envs/mapping/bin/python(+0x1b0736)[0x5564b3a02736]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0x947)[0x5564b3a1ec77]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x1a6)[0x5564b3a01fc6]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0x4e03)[0x5564b3a23133]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x1a6)[0x5564b3a01fc6]
/home/scott/miniconda3/envs/mapping/bin/python(+0x1800cd)[0x5564b39d20cd]
/home/scott/miniconda3/envs/mapping/bin/python(PyObject_GetItem+0x45)[0x5564b39d7935]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0xd53)[0x5564b3a1f083]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalCodeWithName+0x2c3)[0x5564b3a00db3]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x378)[0x5564b3a02198]
/home/scott/miniconda3/envs/mapping/bin/python(+0x1b0841)[0x5564b3a02841]
/home/scott/miniconda3/envs/mapping/bin/python(+0x12404d)[0x5564b397604d]
/home/scott/miniconda3/envs/mapping/bin/python(_PyObject_CallFunction_SizeT+0x99)[0x5564b39761f9]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa11fd)[0x7f56dddfe1fd]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa54d7)[0x7f56dde024d7]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x8a2d5)[0x7f56ddde72d5]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x8adc4)[0x7f56ddde7dc4]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa559a)[0x7f56dde0259a]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa5ac9)[0x7f56dde02ac9]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x13f2b7)[0x7f56dde9c2b7]
/home/scott/miniconda3/envs/mapping/bin/python(+0x129082)[0x5564b397b082]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0x181e)[0x5564b3a1fb4e]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalCodeWithName+0x2c3)[0x5564b3a00db3]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x378)[0x5564b3a02198]
/home/scott/miniconda3/envs/mapping/bin/python(+0x1b0841)[0x5564b3a02841]
/home/scott/miniconda3/envs/mapping/bin/python(+0x12404d)[0x5564b397604d]
/home/scott/miniconda3/envs/mapping/bin/python(_PyObject_CallFunction_SizeT+0x99)[0x5564b39761f9]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa11fd)[0x7f56dddfe1fd]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa54d7)[0x7f56dde024d7]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x8a2d5)[0x7f56ddde72d5]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x8adc4)[0x7f56ddde7dc4]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa559a)[0x7f56dde0259a]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa5ac9)[0x7f56dde02ac9]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x13f2b7)[0x7f56dde9c2b7]
/home/scott/miniconda3/envs/mapping/bin/python(+0x129082)[0x5564b397b082]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0x4e03)[0x5564b3a23133]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x1a6)[0x5564b3a01fc6]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalFrameDefault+0xa63)[0x5564b3a1ed93]
/home/scott/miniconda3/envs/mapping/bin/python(_PyEval_EvalCodeWithName+0x2c3)[0x5564b3a00db3]
/home/scott/miniconda3/envs/mapping/bin/python(_PyFunction_Vectorcall+0x378)[0x5564b3a02198]
/home/scott/miniconda3/envs/mapping/bin/python(+0x1b0841)[0x5564b3a02841]
/home/scott/miniconda3/envs/mapping/bin/python(+0x12404d)[0x5564b397604d]
/home/scott/miniconda3/envs/mapping/bin/python(_PyObject_CallFunction_SizeT+0x99)[0x5564b39761f9]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa11fd)[0x7f56dddfe1fd]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0xa54d7)[0x7f56dde024d7]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so(+0x8a2d5)[0x7f56ddde72d5]
/home/scott/miniconda3/envs/mapping/lib/python3.8/site-packages/numpy/core/_multiarray_umath.cpython-38-x86_64-linux-gnu.so
Aborted (core dumped)
```
xr.show_versions()
```
In [2]: xr.show_versions()
INSTALLED VERSIONS
------------------
commit: None
python: 3.8.12 | packaged by conda-forge | (default, Oct 12 2021, 21:59:51)
[GCC 9.4.0]
python-bits: 64
OS: Linux
OS-release: 3.10.0-1062.4.1.el7.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.20.2
pandas: 1.1.0
numpy: 1.21.2
scipy: 1.5.3
netCDF4: 1.5.4
pydap: None
h5netcdf: 0.11.0
h5py: 3.2.1
Nio: None
zarr: 2.8.3
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.2.6
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2021.01.0
distributed: 2.20.0
matplotlib: 3.3.1
cartopy: 0.19.0.post1
seaborn: None
numbagg: None
fsspec: 0.6.3
cupy: 9.0.0
pint: 0.17
sparse: None
setuptools: 50.3.2
pip: 21.2.4
conda: 4.8.4
pytest: 6.2.4
IPython: 7.18.1
sphinx: 4.0.2
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 | |
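For files that already exist with fixed-width strings, the earlier comments show the same data reads cleanly when netcdf-c is bypassed; a short sketch using the call already demonstrated in this thread:
```python
# Read-side workaround: the h5netcdf engine goes through h5py rather than
# netcdf-c; phony_dims="sort" is needed because the raw HDF5 file has no
# netCDF dimension scales.
import xarray as xr

ds = xr.load_dataset("test_str_list.h5", engine="h5netcdf", phony_dims="sort")
print(ds["pairs"].values)  # fixed-width bytes, e.g. b"20200101"
```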
1011242728 | https://github.com/pydata/xarray/issues/5706#issuecomment-1011242728 | https://api.github.com/repos/pydata/xarray/issues/5706 | IC_kwDOAMm_X848Rlbo | kmuehlbauer 5821660 | 2022-01-12T16:43:33Z | 2022-01-12T16:43:33Z | MEMBER | @scottstanie Could you please provide the output of `ncdump test_str_list.h5`? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Loading datasets of numpy string arrays leads to error and/or segfault 970619131 |