issue_comments
9 rows where author_association = "MEMBER" and issue = 146079798 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue
---|---|---|---|---|---|---|---|---|---|---|---
1411179632 | https://github.com/pydata/xarray/pull/817#issuecomment-1411179632 | https://api.github.com/repos/pydata/xarray/issues/817 | IC_kwDOAMm_X85UHORw | jhamman 2443309 | 2023-01-31T22:51:57Z | 2023-01-31T22:51:57Z | MEMBER | Closing this as stale and out of date with our current backends. @swnesbitt (or others) - feel free to open a new PR if you feel there is more to do here. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
388967090 | https://github.com/pydata/xarray/pull/817#issuecomment-388967090 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDM4ODk2NzA5MA== | shoyer 1217238 | 2018-05-14T21:21:45Z | 2018-05-14T21:21:45Z | MEMBER | The only way we could make reading a gzipped netCDF4 file work is to load the entire file into memory. That's why we didn't support this before. It's also less relevant for netCDF4, because netCDF4 supports in-file compression directly. With netCDF3, we can use scipy's netcdf reader, which supports Python file objects. But netCDF4-Python does not support Python file objects. This issue is concerned about supporting paths ending with | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
388412798 | https://github.com/pydata/xarray/pull/817#issuecomment-388412798 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDM4ODQxMjc5OA== | jhamman 2443309 | 2018-05-11T16:20:26Z | 2018-05-11T16:20:26Z | MEMBER | @tsaoyu - I don't think anyone has worked on developing a test case for this feature. I assume @swnesbitt would appreciate help there. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
206465453 | https://github.com/pydata/xarray/pull/817#issuecomment-206465453 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIwNjQ2NTQ1Mw== | shoyer 1217238 | 2016-04-06T17:01:45Z | 2017-07-13T18:55:12Z | MEMBER | Currently the way we handle this is that the only test that accesses remote resources is the pydap test, which only runs in one build (not required to pass). | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
218362023 | https://github.com/pydata/xarray/pull/817#issuecomment-218362023 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIxODM2MjAyMw== | shoyer 1217238 | 2016-05-11T04:59:16Z | 2016-05-11T04:59:16Z | MEMBER | I am reluctant to merge this without having any way to test the logic. Without automated tests this issue is likely to recur. That said, I suppose we could leave the refactoring for a TODO. Let's add a note on that and also one minimal test to verify that we raise an error if you try to use | { "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
218359192 | https://github.com/pydata/xarray/pull/817#issuecomment-218359192 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIxODM1OTE5Mg== | jhamman 2443309 | 2016-05-11T04:31:46Z | 2016-05-11T04:31:46Z | MEMBER | @shoyer - are we okay with the final logic here? @swnesbitt - can we get an entry in the what's new docs? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
206464547 | https://github.com/pydata/xarray/pull/817#issuecomment-206464547 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIwNjQ2NDU0Nw== | jhamman 2443309 | 2016-04-06T16:59:58Z | 2016-04-06T16:59:58Z | MEMBER | I'd be a little hesitant to enforce that we get access to opendap (or other remote datasets) on travis. I've seen this come back to bite us in the past. We can allow tests to fail on travis and just print a warning via pytest if it is something we really want to see tested in that way. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
206463565 | https://github.com/pydata/xarray/pull/817#issuecomment-206463565 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIwNjQ2MzU2NQ== | shoyer 1217238 | 2016-04-06T16:56:45Z | 2016-04-06T16:56:45Z | MEMBER | We do have existing tests for backends: https://github.com/pydata/xarray/blob/master/xarray/test/test_backends.py This includes a test accessing an OpenDAP dataset from the OpenDAP test server (via pydap, at the end). But in my experience, these servers are somewhat unreliable (maybe available 90% of the time), so we don't require that test to pass for the build to pass. Also, even in the best-case scenario, network access is slow. So it would be nice to modularize this enough that our logic is testable without actually using opendap. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
206159316 | https://github.com/pydata/xarray/pull/817#issuecomment-206159316 | https://api.github.com/repos/pydata/xarray/issues/817 | MDEyOklzc3VlQ29tbWVudDIwNjE1OTMxNg== | shoyer 1217238 | 2016-04-06T06:57:40Z | 2016-04-06T06:57:40Z | MEMBER | I think this is probably correct, but the heuristics here are starting to get convoluted enough that I worry about test coverage. Is there any way we can test this? Maybe try to pull the gzip logic into a helper function (an extended variant of | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |  | modified: xarray/backends/api.py 146079798
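The netCDF3-vs-netCDF4 gzip point made in comments 388967090 and 206159316 above can be illustrated with a small helper. This is a hypothetical sketch, not xarray's actual implementation; the function name is invented. It only shows the mechanism the thread relies on: scipy's netCDF3 reader accepts file-like objects, so a gzipped path can be streamed through gzip.open(), whereas netCDF4-Python cannot read from file objects at all.

```python
# Hypothetical helper (not xarray's code) illustrating the thread's argument:
# scipy's netCDF3 reader accepts file-like objects, so gzipped netCDF3 files
# can be read transparently; netCDF4-Python cannot, so gzipped netCDF4 would
# require decompressing the whole file first.
import gzip

import scipy.io


def open_maybe_gzipped_netcdf3(path):
    """Open a netCDF3 file, transparently handling a trailing .gz suffix."""
    if path.endswith(".gz"):
        # mmap is disabled because we are reading from a file object,
        # not a plain on-disk file.
        return scipy.io.netcdf_file(gzip.open(path, "rb"), mmap=False)
    return scipy.io.netcdf_file(path)
```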
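The "one minimal test" shoyer asks for in comment 218362023 (truncated in this export) might look roughly like the pytest sketch below. The exception type and the exact condition being guarded are assumptions, since the original comment is cut off here.

```python
# Hedged sketch of a minimal regression test: combining a gzipped path with a
# backend that cannot read file objects should raise a clear error. The
# exception type (ValueError) and the engine name are assumptions.
import pytest
import xarray as xr


def test_gzipped_path_with_netcdf4_engine_raises(tmp_path):
    path = tmp_path / "data.nc.gz"
    path.write_bytes(b"placeholder bytes for the path check")
    with pytest.raises(ValueError):
        xr.open_dataset(str(path), engine="netcdf4")
```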
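The reliability concern about the remote OPeNDAP/pydap test (comments 206463565, 206464547, and 206465453 above) is commonly handled by letting the network-dependent test fail without breaking the build. A sketch of that idea follows; the URL is only an example of a public OPeNDAP endpoint and may not match what the xarray test suite actually uses.

```python
# Sketch of "allow it to fail, just report it": a non-strict xfail keeps a
# flaky remote server from breaking the build while still exercising the
# code path whenever the server is up.
import pytest


@pytest.mark.xfail(reason="remote OPeNDAP test server is unreliable", strict=False)
def test_open_remote_opendap_dataset():
    xr = pytest.importorskip("xarray")
    pytest.importorskip("pydap")
    # Example public OPeNDAP URL; the real test suite may point elsewhere.
    url = "http://test.opendap.org/opendap/data/nc/coads_climatology.nc"
    ds = xr.open_dataset(url, engine="pydap")
    assert len(ds.data_vars) > 0
```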
CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [performed_via_github_app] TEXT,
    [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
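For reference, the row selection shown on this page ("author_association = MEMBER and issue = 146079798, sorted by updated_at descending") can be reproduced against the underlying SQLite database with the standard library. The database filename below is an assumption.

```python
# Reproduce this page's filter with sqlite3; "github.db" is an assumed filename.
import sqlite3

conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, user, created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE author_association = 'MEMBER' AND issue = 146079798
    ORDER BY updated_at DESC
    """
).fetchall()
for comment_id, user, created, updated, assoc, body in rows:
    print(comment_id, user, updated, body[:60])
```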