html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/4122#issuecomment-1453911083,https://api.github.com/repos/pydata/xarray/issues/4122,1453911083,IC_kwDOAMm_X85WqOwr,6042212,2023-03-03T18:12:01Z,2023-03-03T18:12:01Z,CONTRIBUTOR,"> what are the limitations of the netcdf3 standard vs netcdf4
No compression, encoding, or chunking, and only a single unlimited (""append"") dimension.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-1453902381,https://api.github.com/repos/pydata/xarray/issues/4122,1453902381,IC_kwDOAMm_X85WqMot,6042212,2023-03-03T18:04:29Z,2023-03-03T18:04:29Z,CONTRIBUTOR,"scipy only reads/writes netCDF2/3 ( https://docs.scipy.org/doc/scipy/reference/generated/scipy.io.netcdf_file.html ), a much simpler format than netCDF4. The latter uses HDF5 as its container, with h5netcdf as the xarray engine. I guess ""to_netcdf"" is ambiguous.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-1453898602,https://api.github.com/repos/pydata/xarray/issues/4122,1453898602,IC_kwDOAMm_X85WqLtq,6042212,2023-03-03T18:01:30Z,2023-03-03T18:01:30Z,CONTRIBUTOR,"> I use the engine=""scipy"" one for reading.
This is netCDF3, in that case. If that's fine for you, no problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-1453558039,https://api.github.com/repos/pydata/xarray/issues/4122,1453558039,IC_kwDOAMm_X85Wo4kX,6042212,2023-03-03T13:48:09Z,2023-03-03T13:48:09Z,CONTRIBUTOR,"Maybe it is netCDF3? xarray is supposed to be able to determine the file type
```
with fsspec.open(""s3://some_bucket/some_remote_destination.nc"", mode=""rb"") as ff:
    ds = xr.open_dataset(ff)
```
but maybe play with the engine= argument.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-1400583499,https://api.github.com/repos/pydata/xarray/issues/4122,1400583499,IC_kwDOAMm_X85TezVL,6042212,2023-01-23T15:57:24Z,2023-01-23T15:57:24Z,CONTRIBUTOR,Would you mind writing out long-hand the version that worked and the version that didn't?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-1400545067,https://api.github.com/repos/pydata/xarray/issues/4122,1400545067,IC_kwDOAMm_X85Tep8r,6042212,2023-01-23T15:31:16Z,2023-01-23T15:31:16Z,CONTRIBUTOR,"I can confirm that something like the following does work, basically automating the ""write local and then push"" workflow:
```
import xarray as xr
import fsspec
ds = xr.open_dataset('http://geoport.usgs.esipfed.org/thredds/dodsC'
                     '/silt/usgs/Projects/stellwagen/CF-1.6/BUZZ_BAY/2651-A.cdf')
outfile = fsspec.open('simplecache::gcs://mdtemp/foo2.nc', mode='wb')
with outfile as f:
    ds.to_netcdf(f)
```
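The manual equivalent of what ``simplecache::`` automates is: write to a local temporary file, then push the finished file to the remote. A stdlib-only sketch of that workflow (``push_netcdf`` is a hypothetical helper name, and ``shutil.copy`` stands in for an upload to the object store):

```python
import os
import shutil
import tempfile

def push_netcdf(write_fn, remote_path):
    """Write via write_fn(local_path) into a temp file, then copy it out.

    Hypothetical helper sketching the "write local, then push" workflow;
    shutil.copy stands in for the upload to the real remote store.
    """
    fd, local_path = tempfile.mkstemp(suffix=".nc")
    os.close(fd)  # the writer reopens the path itself
    try:
        write_fn(local_path)  # e.g. lambda p: ds.to_netcdf(p)
        shutil.copy(local_path, remote_path)
    finally:
        os.remove(local_path)  # clean up the local cache copy
```

The local file is seekable, so any netCDF engine works; only the final, complete file ever touches the remote.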
Unfortunately, directly writing to the remote file without a local cached file is not supported, because HDF5 does not write in a linear way.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-655298190,https://api.github.com/repos/pydata/xarray/issues/4122,655298190,MDEyOklzc3VlQ29tbWVudDY1NTI5ODE5MA==,1386642,2020-07-08T05:39:14Z,2020-07-08T05:39:14Z,CONTRIBUTOR,"I’ve run into this as well. It’s not pretty, but my usual workaround is to write to a local temporary file and then upload with fsspec. I can never remember exactly which netCDF engine to use...","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856
https://github.com/pydata/xarray/issues/4122#issuecomment-639777701,https://api.github.com/repos/pydata/xarray/issues/4122,639777701,MDEyOklzc3VlQ29tbWVudDYzOTc3NzcwMQ==,6042212,2020-06-05T20:17:38Z,2020-06-05T20:17:38Z,CONTRIBUTOR,"The write feature for simplecache isn't released yet, of course.
It would be interesting if someone could subclass a file object and write locally with h5netcdf to see what kind of seeks it does. Is it popping back to some file header to update array sizes? Presumably it would need a fixed-size header to do that. Parquet and other cloud formats put the metadata in the footer exactly for this reason, so you only write once you know everything and you only ever move forward in the file.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,631085856