html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/7789#issuecomment-1523870704,https://api.github.com/repos/pydata/xarray/issues/7789,1523870704,IC_kwDOAMm_X85a1Gvw,8382834,2023-04-26T18:30:58Z,2023-04-26T18:32:33Z,CONTRIBUTOR,"Just found the solution (ironically, I had been banging my head against this for quite a while before writing this issue, and then found the answer right after posting it): one needs to provide both ```account_name``` and ```sas_token``` together. The ```adlfs``` exception actually points to the right problem; I was just confused by it. I.e., this works:
```
xr.open_mfdataset([filename], engine=""zarr"", storage_options={'account_name':AZURE_STORAGE_ACCOUNT_NAME, 'sas_token': AZURE_STORAGE_SAS})
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1685503657
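The ```storage_options``` dict is forwarded to ```adlfs```, so a minimal sketch like the following (assuming ```adlfs``` is installed, reusing the same credential variables as above, and with ```my-container``` as a placeholder container name) can be used to sanity-check the credentials outside of xarray:
```python
import adlfs

# Both pieces are needed together, just as in the storage_options above.
fs = adlfs.AzureBlobFileSystem(
    account_name=AZURE_STORAGE_ACCOUNT_NAME,
    sas_token=AZURE_STORAGE_SAS,
)

# If the credentials are valid, listing a container should work.
print(fs.ls("my-container"))
```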
https://github.com/pydata/xarray/issues/7421#issuecomment-1373697191,https://api.github.com/repos/pydata/xarray/issues/7421,1373697191,IC_kwDOAMm_X85R4PSn,8382834,2023-01-06T14:13:52Z,2023-01-06T14:13:52Z,CONTRIBUTOR,"After creating a conda environment as you suggest, I am fully able to read the file, so this solves my issue. Many thanks! I guess this means there is some issue leading to segfaults on this file with some of the older libnetcdf versions. Closing, as using a conda env with a more recent stack fixes things.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1520760951
https://github.com/pydata/xarray/issues/7421#issuecomment-1373600032,https://api.github.com/repos/pydata/xarray/issues/7421,1373600032,IC_kwDOAMm_X85R33kg,8382834,2023-01-06T13:09:29Z,2023-01-06T13:09:29Z,CONTRIBUTOR,"Ok, thanks, this does crash on my machine too. Then it is likely something in my software stack somewhere; I will try with a fresh mamba / conda environment and check whether that fixes things.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1520760951
https://github.com/pydata/xarray/issues/7421#issuecomment-1373587759,https://api.github.com/repos/pydata/xarray/issues/7421,1373587759,IC_kwDOAMm_X85R30kv,8382834,2023-01-06T12:57:38Z,2023-01-06T12:57:38Z,CONTRIBUTOR,"@keewis regarding the engine: I have netcdf4 installed and I do not provide a dedicated engine in the ```open_dataset``` command, so I guess this is using the netcdf4 engine by default? Is there a command I can run to double check and confirm that to you? :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1520760951
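A minimal way to check this (a sketch only, assuming a reasonably recent xarray and reusing the ```input_file``` variable from the snippets in this thread):
```python
import xarray as xr

# Backends that xarray can see in this environment.
print(xr.backends.list_engines())

# Forcing the engine removes the guesswork: if this call behaves the same way
# as the plain open_dataset call, the netcdf4 backend is the one being picked.
xr_file = xr.open_dataset(input_file, engine="netcdf4", decode_times=False)
```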
https://github.com/pydata/xarray/issues/7421#issuecomment-1373584045,https://api.github.com/repos/pydata/xarray/issues/7421,1373584045,IC_kwDOAMm_X85R3zqt,8382834,2023-01-06T12:53:18Z,2023-01-06T12:53:18Z,CONTRIBUTOR,"@keewis interesting. Just to be sure: I am able to open the dataset just fine too; the issue arises when actually reading the field, i.e.:
```python
xr_file = xr.open_dataset(input_file, decode_times=False)
```
is just fine, but
```python
xr_file[""accD""][0, 0:3235893].data
```
is what segfaults; just to be sure there is no misunderstanding, are you actually able to run the last command without issue? :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1520760951
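One hedged way to narrow this down (a sketch only, assuming the optional ```h5netcdf``` package is installed) is to read the same slice through a different backend:
```python
import xarray as xr

# Same read, but through the h5netcdf backend instead of the default netcdf4 one.
# If this succeeds while the netcdf4 engine segfaults, the crash most likely
# lives in the libnetcdf build rather than in xarray itself.
xr_file = xr.open_dataset(input_file, engine="h5netcdf", decode_times=False)
data = xr_file["accD"][0, 0:3235893].data
```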
https://github.com/pydata/xarray/issues/7421#issuecomment-1373580808,https://api.github.com/repos/pydata/xarray/issues/7421,1373580808,IC_kwDOAMm_X85R3y4I,8382834,2023-01-06T12:49:05Z,2023-01-06T12:49:05Z,CONTRIBUTOR,"I got help extracting more information in gdb; converting the ipynb to a .py file and running it under gdb:
```
> jupyter nbconvert --to script issue_opening_2018_03_b.ipynb
[NbConvertApp] Converting notebook issue_opening_2018_03_b.ipynb to script
[NbConvertApp] Writing 1313 bytes to issue_opening_2018_03_b.py
> gdb --args python3 issue_opening_2018_03_b.py
[...]
(gdb) run
Starting program: /usr/bin/python3 issue_opening_2018_03_b.py
[Thread debugging using libthread_db enabled]
Using host libthread_db library ""/lib/x86_64-linux-gnu/libthread_db.so.1"".
[...]
Thread 1 ""python3"" received signal SIGSEGV, Segmentation fault.
__memmove_sse2_unaligned_erms () at ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S:314
314 ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S: No such file or directory.
(gdb) bt
#0 __memmove_sse2_unaligned_erms () at ../sysdeps/x86_64/multiarch/memmove-vec-unaligned-erms.S:314
#1 0x00007ffff6af4bdc in NC4_get_vars () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#2 0x00007ffff6af337d in NC4_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#3 0x00007ffff6a959aa in NC_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#4 0x00007ffff6a96b9b in nc_get_vara () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/.libs/libnetcdf-5e98d7e6.so.15.0.0
#5 0x00007ffff6ec24bc in ?? () from /home/jrmet/.local/lib/python3.8/site-packages/netCDF4/_netCDF4.cpython-38-x86_64-linux-gnu.so
#6 0x00000000005f5b39 in PyCFunction_Call ()
```
which seems to be originating in libnetcdf?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1520760951
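A short way to report exactly which library builds are involved (these attributes are provided by the ```netCDF4``` python package, which the backtrace above points at):
```python
import netCDF4
import xarray as xr

# Versions of the C libraries the netCDF4 python package was built against.
print("netCDF4 python:", netCDF4.__version__)
print("libnetcdf:", netCDF4.__netcdf4libversion__)
print("libhdf5:", netCDF4.__hdf5libversion__)

# xarray's environment summary also lists libnetcdf and libhdf5.
xr.show_versions()
```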
https://github.com/pydata/xarray/issues/7363#issuecomment-1340951101,https://api.github.com/repos/pydata/xarray/issues/7363,1340951101,IC_kwDOAMm_X85P7Uo9,8382834,2022-12-07T13:14:56Z,2022-12-07T13:14:56Z,CONTRIBUTOR,"(Really feeling bad about missing your nice suggestion, @headtr1ck; I must find a better way to juggle computer / smartphone / tablet without missing comments :see_no_evil:. Again, thanks for all the help.)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1479121713
https://github.com/pydata/xarray/issues/7363#issuecomment-1340947035,https://api.github.com/repos/pydata/xarray/issues/7363,1340947035,IC_kwDOAMm_X85P7Tpb,8382834,2022-12-07T13:12:28Z,2022-12-07T13:12:28Z,CONTRIBUTOR,"Oooh, I am so sorry @headtr1ck, apologies. I rely a lot on the email notifications to keep track of things, and your message and the one from @keewis arrived at the same time, so I missed yours. Really sorry, and many thanks for pointing to this first; my bad.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1479121713
https://github.com/pydata/xarray/issues/7363#issuecomment-1340939982,https://api.github.com/repos/pydata/xarray/issues/7363,1340939982,IC_kwDOAMm_X85P7R7O,8382834,2022-12-07T13:07:10Z,2022-12-07T13:07:10Z,CONTRIBUTOR,"Following the pointer by @keewis, I just ran:
```
extended_observations = previous_observations.pad(pad_width={""time"": (0, needed_padding)}, mode=""constant"", constant_values=-999)
```
This runs nearly instantaneously and does exactly what I need. Many thanks to all for your help, and sorry for missing that the pad function existed. I will close for now (the only remaining question is why the call to reindex is so costly on my machine; I wonder whether some old version of an underlying library is at play).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1479121713
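For comparison, a hypothetical reindex-based equivalent of the ```pad``` call above (a sketch only, assuming a 1-D, regularly spaced ```time``` coordinate on ```previous_observations```; the exact reindex call from the original report is not reproduced here):
```python
import numpy as np

# Build an extended time axis with `needed_padding` extra steps appended.
time = previous_observations["time"].values
step = time[1] - time[0]
new_time = np.concatenate(
    [time, time[-1] + step * np.arange(1, needed_padding + 1)]
)

# reindex pads by label alignment, which is the operation reported as slow,
# whereas pad simply appends constant values along the dimension.
extended_via_reindex = previous_observations.reindex(time=new_time, fill_value=-999)
```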
https://github.com/pydata/xarray/issues/7363#issuecomment-1340816132,https://api.github.com/repos/pydata/xarray/issues/7363,1340816132,IC_kwDOAMm_X85P6zsE,8382834,2022-12-07T11:14:03Z,2022-12-07T11:14:03Z,CONTRIBUTOR,"Aaah, you are right @keewis, ```pad``` should do exactly what I need :) . Many thanks. Interesting, I did spend a bit of time looking for this and somehow could not find it; it is always hard to find the right function when you do not know in advance what name to look for :) .
I will check the use of ```pad``` this afternoon and I think it will fit my need. Still not sure why reindex was so problematic on my machine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1479121713
https://github.com/pydata/xarray/issues/7363#issuecomment-1340806519,https://api.github.com/repos/pydata/xarray/issues/7363,1340806519,IC_kwDOAMm_X85P6xV3,8382834,2022-12-07T11:06:21Z,2022-12-07T11:06:21Z,CONTRIBUTOR,"Yes, this is representative of my dataset :) .
Ok, interesting. I tried this on my machine (Ubuntu 20.04, with 16 GB of RAM, 15.3 GB reported by the system as the maximum available memory):
- I start at around 6 GB used, i.e. 9.3 GB available
- I run the script in ipython3; after a few seconds my machine exhausts its RAM and freezes, then the process gets killed:
```
[ins] In [1]: import numpy as np
...: import xarray as xr
...: import datetime
...:
...: # create two timeseries', second is for reindex
...: itime = np.arange(0, 3208464).astype(""