html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/817#issuecomment-1411179632,https://api.github.com/repos/pydata/xarray/issues/817,1411179632,IC_kwDOAMm_X85UHORw,2443309,2023-01-31T22:51:57Z,2023-01-31T22:51:57Z,MEMBER,Closing this as stale and out of date with our current backends. @swnesbitt (or others) - feel free to open a new PR if you feel there is more to do here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-388967090,https://api.github.com/repos/pydata/xarray/issues/817,388967090,MDEyOklzc3VlQ29tbWVudDM4ODk2NzA5MA==,1217238,2018-05-14T21:21:45Z,2018-05-14T21:21:45Z,MEMBER,"The only way we could make reading a gzipped netCDF4 file work is to load the entire file into memory. That's why we didn't support this before. It's also less relevant for netCDF4, because netCDF4 supports in-file compression directly.
With netCDF3, we can use scipy's netcdf reader, which supports Python file objects. But netCDF4-Python does not support Python file objects.
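For example, a gzipped netCDF3 file can be streamed through the scipy backend roughly like this (a minimal sketch, assuming a hypothetical local `example.nc.gz`):

```python
import gzip
import xarray as xr

# scipy's netCDF3 reader accepts file-like objects, so the file can be
# decompressed on the fly; netCDF4-Python offers no equivalent, which is
# why a gzipped netCDF4 file would have to be read fully into memory.
with gzip.open('example.nc.gz', 'rb') as f:  # hypothetical local file
    ds = xr.open_dataset(f, engine='scipy')
    print(ds)
```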
This issue is concerned with supporting paths ending in `.gz` in remote URLs, which are not local files but rather files exposed via the OpenDAP protocol over a network. If the DAP server can serve a file with a `.gz` extension then xarray should be OK with it, too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-388749495,https://api.github.com/repos/pydata/xarray/issues/817,388749495,MDEyOklzc3VlQ29tbWVudDM4ODc0OTQ5NQ==,6488896,2018-05-14T09:11:08Z,2018-05-14T09:11:08Z,NONE,"I am trying to understand the logic by looking through the comments/discussion on this commit.
After two years of development there have been quite a few changes to the API itself; for example, the gzipped file handling logic seems to have already moved from `api.py`
to `scipy_.py`. So I think it would be better to start a new commit from the latest version.
To get everyone on the same page: there are 3 backends that can be used to handle gzipped netCDF files, namely `scipy`, `netcdf4` and `pydap`. The `scipy` backend only supports netCDF3 so far, and that is the only implemented way to handle a gzipped netCDF3 file. `netcdf4` and `pydap` should be able to deal with a netCDF4 file given that the file is first opened with `gzip.open` (as it was implemented in `scipy_.py`).
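Roughly, the `gzip.open` approach used for the scipy backend looks like this (a minimal sketch only; `_maybe_gunzip` and its signature are illustrative, not the actual xarray internals):

```python
import gzip

def _maybe_gunzip(filename, opener):
    # illustrative helper: if the path ends with .gz, decompress on the
    # fly and hand the file object to the backend opener; otherwise pass
    # the path through unchanged.
    if filename.endswith('.gz'):
        with gzip.open(filename, 'rb') as f:
            return opener(f)
    return opener(filename)
```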
So what we need to do is make both `netCDF4_.py` and `pydap_.py` aware that the file ends with `.gz` and first open it using `gzip.open`. Tests also need to be added to verify the implementation of this logic. I have looked at vcrpy and it seems like a nice way to speed up tests that involve HTTP. I must confess I only use xarray with local files, so it may take a while to understand the issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-388440871,https://api.github.com/repos/pydata/xarray/issues/817,388440871,MDEyOklzc3VlQ29tbWVudDM4ODQ0MDg3MQ==,221526,2018-05-11T18:04:18Z,2018-05-11T18:04:18Z,CONTRIBUTOR,"Regarding the testing issue, another option is to use something like vcrpy to record and playback http responses for opendap requests. I've had good luck with that for Siphon.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-388412798,https://api.github.com/repos/pydata/xarray/issues/817,388412798,MDEyOklzc3VlQ29tbWVudDM4ODQxMjc5OA==,2443309,2018-05-11T16:20:26Z,2018-05-11T16:20:26Z,MEMBER,@tsaoyu - I don't think anyone has worked on developing a test case for this feature. I assume @swnesbitt would appreciate help there. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-388386642,https://api.github.com/repos/pydata/xarray/issues/817,388386642,MDEyOklzc3VlQ29tbWVudDM4ODM4NjY0Mg==,6488896,2018-05-11T14:49:48Z,2018-05-11T14:49:48Z,NONE,How has this issue progressed so far? I am running into the same problem now and I might be able to offer some help with testing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206465453,https://api.github.com/repos/pydata/xarray/issues/817,206465453,MDEyOklzc3VlQ29tbWVudDIwNjQ2NTQ1Mw==,1217238,2016-04-06T17:01:45Z,2017-07-13T18:55:12Z,MEMBER,"Currently, the way we handle this is that the only test that accesses
remote resources is the pydap test, which only runs in one build (not
required to pass).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-218362023,https://api.github.com/repos/pydata/xarray/issues/817,218362023,MDEyOklzc3VlQ29tbWVudDIxODM2MjAyMw==,1217238,2016-05-11T04:59:16Z,2016-05-11T04:59:16Z,MEMBER,"I am reluctant to merge this without having any way to test the logic. Without automated tests this issue is likely to recur.
That said, I suppose we could leave the refactoring as a TODO. Let's add a note on that and also one minimal test to verify that we raise an error if you try to use `engine='netcdf4'` with a path to a local gzipped file.
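A minimal sketch of such a test could look like the following (hypothetical test name; the exact exception type varies between xarray versions, so it is kept loose here):

```python
import gzip

import pytest
import xarray as xr

def test_netcdf4_engine_rejects_local_gzip(tmp_path):
    # write a small gzipped file that is clearly not a valid netCDF4 file
    path = tmp_path / 'data.nc.gz'
    with gzip.open(path, 'wb') as f:
        f.write(b'not a real netCDF file')
    # depending on the xarray version this surfaces as a ValueError from
    # the engine check or an OSError from netCDF4 itself
    with pytest.raises((ValueError, OSError)):
        xr.open_dataset(str(path), engine='netcdf4')
```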
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-218359192,https://api.github.com/repos/pydata/xarray/issues/817,218359192,MDEyOklzc3VlQ29tbWVudDIxODM1OTE5Mg==,2443309,2016-05-11T04:31:46Z,2016-05-11T04:31:46Z,MEMBER,"@shoyer - are we okay with the final logic here?
@swnesbitt - can we get an entry in the what's new docs?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206470775,https://api.github.com/repos/pydata/xarray/issues/817,206470775,MDEyOklzc3VlQ29tbWVudDIwNjQ3MDc3NQ==,3288592,2016-04-06T17:15:57Z,2016-04-06T17:15:57Z,NONE,"OK - understood. Incidentally, I didn't include pydap because it doesn't seem to be python 3.\* compatible. I tested it on 2.7 for my application and it seems to work.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206464547,https://api.github.com/repos/pydata/xarray/issues/817,206464547,MDEyOklzc3VlQ29tbWVudDIwNjQ2NDU0Nw==,2443309,2016-04-06T16:59:58Z,2016-04-06T16:59:58Z,MEMBER,"> Regarding testing, it would be good to have a test module that includes grabbing files for local access and accessing openDAP datasets (from UNIDATA?) to ensure that the installed backends are all working as expected.
I'd be a little hesitant to enforce that we get access to opendap (or other remote datasets) on travis. I've seen this come back to bite us in the past. We can allow tests to fail on travis and just print a warning via pytest if it is something we really want to see tested in that way.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206463565,https://api.github.com/repos/pydata/xarray/issues/817,206463565,MDEyOklzc3VlQ29tbWVudDIwNjQ2MzU2NQ==,1217238,2016-04-06T16:56:45Z,2016-04-06T16:56:45Z,MEMBER,"We do have existing tests for backends: https://github.com/pydata/xarray/blob/master/xarray/test/test_backends.py
This includes a test accessing an OpenDAP dataset from the OpenDAP test server (via pydap, at the end). But in my experience, these servers are somewhat unreliable (maybe available 90% of the time), so we don't require that test to pass for the build to pass. Also, even in the best-case scenario, network access is slow. So it would be nice to modularize this enough that our logic is testable without actually using opendap.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206461095,https://api.github.com/repos/pydata/xarray/issues/817,206461095,MDEyOklzc3VlQ29tbWVudDIwNjQ2MTA5NQ==,3288592,2016-04-06T16:49:24Z,2016-04-06T16:49:24Z,NONE,"Certainly can take a crack at this. This is working for my application, but would be good to clean up the code to simplify the cases as you suggest.
Regarding testing, it would be good to have a test module that includes grabbing files for local access and accessing openDAP datasets (from UNIDATA?) to ensure that the installed backends are all working as expected.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798
https://github.com/pydata/xarray/pull/817#issuecomment-206159316,https://api.github.com/repos/pydata/xarray/issues/817,206159316,MDEyOklzc3VlQ29tbWVudDIwNjE1OTMxNg==,1217238,2016-04-06T06:57:40Z,2016-04-06T06:57:40Z,MEMBER,"I think this is probably correct, but the heuristics here are starting to get convoluted enough that I worry about test coverage. Is there any way we can test this? Maybe try to pull the gzip logic into a helper function (an extended variant of `_get_default_engine`) that we could test?
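For instance, a helper along these lines would be straightforward to unit-test without any network access (a rough sketch only; the real `_get_default_engine` in xarray handles more cases):

```python
def _get_default_engine(path):
    # illustrative heuristic, not the actual xarray implementation
    is_remote = path.startswith(('http://', 'https://'))
    if path.endswith('.gz'):
        if is_remote:
            # a .gz suffix in an OpenDAP URL is just part of the dataset
            # name, so treat it like any other remote path
            return 'pydap'
        # only the scipy backend can read a local gzipped netCDF3 file
        return 'scipy'
    return 'pydap' if is_remote else 'netcdf4'
```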
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,146079798