issue_comments
7 rows where author_association = "CONTRIBUTOR" and user = 10819524 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue
---|---|---|---|---|---|---|---|---|---|---|---
1243917954 | https://github.com/pydata/xarray/issues/7012#issuecomment-1243917954 | https://api.github.com/repos/pydata/xarray/issues/7012 | IC_kwDOAMm_X85KJK6C | Zeitsperre 10819524 | 2022-09-12T15:33:10Z | 2022-09-12T15:33:10Z | CONTRIBUTOR | @mathause You're right! I noticed this first in my builds using "upstream" dependencies (xarray@main, flox@main, cftime@master, bottleneck@master). It might indeed be flox-related! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Time-based resampling drops lat/lon coordinate metadata 1367029446
678423562 | https://github.com/pydata/xarray/issues/4054#issuecomment-678423562 | https://api.github.com/repos/pydata/xarray/issues/4054 | MDEyOklzc3VlQ29tbWVudDY3ODQyMzU2Mg== | Zeitsperre 10819524 | 2020-08-21T18:16:16Z | 2020-08-21T18:16:16Z | CONTRIBUTOR | Just discovered that the same thing is true for | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Type checking fails for multiplication 617140674
677920840 | https://github.com/pydata/xarray/issues/4054#issuecomment-677920840 | https://api.github.com/repos/pydata/xarray/issues/4054 | MDEyOklzc3VlQ29tbWVudDY3NzkyMDg0MA== | Zeitsperre 10819524 | 2020-08-20T21:41:51Z | 2020-08-20T22:48:49Z | CONTRIBUTOR | We're currently working on a library largely based on xarray and have seen the same types of errors from mypy (PR in our project that is currently trying to integrate mypy: https://github.com/Ouranosinc/xclim/pull/532). Currently working off of xarray v0.16. I also want to note this error is raised for other operations as well ( | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Type checking fails for multiplication 617140674
491853311 | https://github.com/pydata/xarray/pull/2957#issuecomment-491853311 | https://api.github.com/repos/pydata/xarray/issues/2957 | MDEyOklzc3VlQ29tbWVudDQ5MTg1MzMxMQ== | Zeitsperre 10819524 | 2019-05-13T14:46:50Z | 2019-05-13T14:46:50Z | CONTRIBUTOR | This PR addresses https://github.com/Ouranosinc/xclim/issues/199 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | DOC: Added xclim to related projects 443440217
462422387 | https://github.com/pydata/xarray/issues/2417#issuecomment-462422387 | https://api.github.com/repos/pydata/xarray/issues/2417 | MDEyOklzc3VlQ29tbWVudDQ2MjQyMjM4Nw== | Zeitsperre 10819524 | 2019-02-11T17:41:47Z | 2019-02-11T17:41:47Z | CONTRIBUTOR | Hi @jhamman, please excuse the lateness of this reply. It turned out that in the end all I needed to do was set | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Limiting threads/cores used by xarray(/dask?) 361016974
453203293 | https://github.com/pydata/xarray/issues/2664#issuecomment-453203293 | https://api.github.com/repos/pydata/xarray/issues/2664 | MDEyOklzc3VlQ29tbWVudDQ1MzIwMzI5Mw== | Zeitsperre 10819524 | 2019-01-10T18:30:48Z | 2019-01-10T18:30:48Z | CONTRIBUTOR | That certainly is the error. The workaround identified for it is good enough for now. Thanks! | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Xarray fails to build with bottleneck on Travis CI 397950349
422445732 | https://github.com/pydata/xarray/issues/2417#issuecomment-422445732 | https://api.github.com/repos/pydata/xarray/issues/2417 | MDEyOklzc3VlQ29tbWVudDQyMjQ0NTczMg== | Zeitsperre 10819524 | 2018-09-18T15:44:03Z | 2018-09-18T15:44:03Z | CONTRIBUTOR | As per your suggestion, I retried with chunking and found a new error (due to the nature of my data having rotated poles, dask demanded that I save my data with astype(); this isn't my major concern, so I'll deal with that somewhere else). What I did notice was that when chunking was specified ( This is really a mystery and, unfortunately, I haven't a clue how this behaviour is possible if parallel processing is disabled by default. The speed of my results when dask multiprocessing isn't specified suggests that it must be using more processing power. Could these spikes in CPU usage be due to other processes (e.g. memory usage, I/O)? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | Limiting threads/cores used by xarray(/dask?) 361016974
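The first row above (issue #7012, "Time-based resampling drops lat/lon coordinate metadata") concerns coordinate attributes disappearing after a time-based resample. A minimal sketch of that kind of pattern, assuming invented data and coordinate names; whether the attrs survive depends on the installed xarray/flox versions, so this is illustrative rather than a confirmed test case:

```python
# Hypothetical reproduction sketch for the behaviour discussed in issue #7012.
# Data, dimension names, and attrs are invented for illustration.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", periods=365, freq="D")
da = xr.DataArray(
    np.random.rand(365, 2, 2),
    coords={"time": time, "lat": [10.0, 20.0], "lon": [30.0, 40.0]},
    dims=("time", "lat", "lon"),
    name="tas",
)
da["lat"].attrs["units"] = "degrees_north"
da["lon"].attrs["units"] = "degrees_east"

monthly = da.resample(time="1M").mean()
print(monthly["lat"].attrs)  # the report describes these attrs coming back empty
```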
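The two comments on issue #4054 ("Type checking fails for multiplication") describe mypy rejecting ordinary arithmetic on a DataArray, while the code runs fine. A minimal sketch of the kind of expression involved, assuming an xarray of roughly the v0.16 era; the exact mypy error text would depend on the versions in use:

```python
# Sketch of the pattern reported in issue #4054: runtime behaviour is fine,
# but older xarray type information led mypy to flag operator arithmetic
# such as multiplication (and, per the comment above, other operators too).
import xarray as xr

da = xr.DataArray([1.0, 2.0, 3.0], dims="x")
scaled = da * 2.0  # the kind of expression the static check complained about
print(scaled.values)
```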
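The comments on issue #2417 ("Limiting threads/cores used by xarray(/dask?)") concern capping how much parallelism dask uses behind xarray. The specific setting the commenter ended up with is truncated above, so the following is only a generic sketch of one common approach; the file name, chunk sizes, and worker count are assumptions for illustration:

```python
# Generic sketch of limiting dask's local parallelism for xarray work.
# This is an assumed approach, not the specific setting referred to in the
# (truncated) comment above.
import dask
import xarray as xr

with dask.config.set(scheduler="threads", num_workers=2):  # cap at 2 worker threads
    ds = xr.open_dataset("example.nc", chunks={"time": 100})  # hypothetical file/chunks
    result = ds.mean("time").compute()
```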
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
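For reference, the filter at the top of this page ("7 rows where author_association = 'CONTRIBUTOR' and user = 10819524 sorted by updated_at descending") maps to a straightforward query against the schema above. A minimal sketch, assuming the data lives in a local SQLite file built with github-to-sqlite; the file name is an assumption:

```python
# Sketch of the query behind this page, run against a hypothetical local
# SQLite database file produced by github-to-sqlite.
import sqlite3

conn = sqlite3.connect("github.db")  # assumed database file name
rows = conn.execute(
    """
    SELECT id, issue, created_at, updated_at, body
    FROM issue_comments
    WHERE author_association = 'CONTRIBUTOR' AND user = 10819524
    ORDER BY updated_at DESC
    """
).fetchall()
print(len(rows))  # 7 for the filter shown above
```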