issues
2 rows where type = "issue" and user = 25382032 sorted by updated_at descending
Issue #6506: HDF Netcdf4 error when writing to_netcdf
id: 1210753839 | node_id: I_kwDOAMm_X85IKqMv | user: andreall (25382032) | state: closed | locked: 0 | comments: 2
created_at: 2022-04-21T09:21:25Z | updated_at: 2023-09-12T15:12:28Z | closed_at: 2023-09-12T15:12:28Z
author_association: NONE | state_reason: completed | repo: xarray (13221727) | type: issue
reactions: https://api.github.com/repos/pydata/xarray/issues/6506/reactions (total_count: 0)

What is your issue? I have a DataArray of significant wave height data for the Mediterranean Sea and I'm trying to compute the monthly quantiles. I have used the following code for some models and it was working correctly: [code block not captured in this export]

Now I have a new model to analyse and it is failing: [error output not captured in this export]

I have tried updating xarray, netcdf4 and h5netcdf, changing the engine, and changing the encoding. Any thoughts? Thanks!
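The reporter's actual code was not captured in this export. As a rough sketch of the workflow they describe (all names, sizes, and quantile levels here are invented for illustration), monthly quantiles of a wave-height DataArray followed by a netCDF write might look like:

```python
import os
import tempfile

import numpy as np
import pandas as pd
import xarray as xr

# Hypothetical stand-in for the reporter's significant-wave-height data:
# one year of daily values on a tiny lon/lat grid.
time = pd.date_range("2000-01-01", periods=365, freq="D")
hs = xr.DataArray(
    np.random.default_rng(0).random((365, 4, 3)),
    coords={"time": time, "lon": np.arange(4.0), "lat": np.arange(3.0)},
    dims=("time", "lon", "lat"),
    name="hs",
)

# Monthly quantiles: group by calendar month, then reduce over time
# within each group. The quantile levels are arbitrary examples.
monthly_q = hs.groupby("time.month").quantile([0.5, 0.9, 0.99], dim="time")

# Cast the int64 "month" coordinate to int32 so the write also works with
# the netCDF3-only scipy backend.
monthly_q = monthly_q.assign_coords(month=monthly_q["month"].astype("int32"))

# Load into memory, then write; backend (HDF5/netCDF4) errors like the
# one reported typically surface at this to_netcdf step.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "monthly_quantiles.nc")
    monthly_q.load()
    monthly_q.to_netcdf(path)
```

When a write fails only for one model's files, comparing `ds.encoding` and dtypes between a working and a failing input is often the quickest way to spot the offending variable.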
Issue #5567: quantile to_netcdf loading original data
id: 935818279 | node_id: MDU6SXNzdWU5MzU4MTgyNzk= | user: andreall (25382032) | state: closed | locked: 0 | comments: 4
created_at: 2021-07-02T14:23:27Z | updated_at: 2021-07-05T15:48:50Z | closed_at: 2021-07-05T15:48:50Z
author_association: NONE | state_reason: completed | repo: xarray (13221727) | type: issue
reactions: https://api.github.com/repos/pydata/xarray/issues/5567/reactions (total_count: 0)

I have a dataset from an RCM which I have opened using: [code block not captured in this export]

which is of shape (lon=180, lat=336, time=105193). I then perform the analysis: [code block not captured in this export]

ds then has shape (quantiles=106, lon=180, lat=336). I then want to save this data to file: [code block not captured in this export]

It gives me the following error:

Traceback (most recent call last):
  File "03_bias_adjustment_waves_nc.py", line 54, in get_quantiles
    ds.to_netcdf(dir_data / 'quantiles_hindc.nc')
  File "/opt/anaconda3/lib/python3.7/site-packages/xarray/core/dataarray.py", line 808, in load
    ds = self._to_temp_dataset().load(**kwargs)
  File "/opt/anaconda3/lib/python3.7/site-packages/xarray/core/dataset.py", line 654, in load
    evaluated_data = da.compute(*lazy_data.values(), **kwargs)
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/base.py", line 452, in compute
    results = schedule(dsk, keys, **kwargs)
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/threaded.py", line 84, in get
    **kwargs
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/local.py", line 486, in get_async
    raise_exception(exc, tb)
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/local.py", line 316, in reraise
    raise exc
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/local.py", line 222, in execute_task
    result = _execute_task(task, data)
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/core.py", line 121, in <genexpr>
    return func(*(_execute_task(a, cache) for a in args))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/core.py", line 121, in <genexpr>
    return func(*(_execute_task(a, cache) for a in args))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/core.py", line 121, in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/array/core.py", line 4527, in concatenate_axes
    return concatenate3(transposelist(arrays, axes, extradims=extradims))
  File "/opt/anaconda3/lib/python3.7/site-packages/dask/array/core.py", line 4510, in concatenate3
    result = np.empty(shape=shape, dtype=dtype(deepfirst(arrays)))
MemoryError: Unable to allocate 47.4 GiB for an array with shape (180, 336, 105193) and data type float64

I don't know why it is still keeping the original data in memory when I have replaced it with the quantiles dataset. I have also tried assigning it to a different variable (ds_quantiles) and doing ds.close() beforehand, but it still gives the same error. Any ideas? Thanks!
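The open/quantile/save code the reporter used is not in this export. A minimal sketch of the same pipeline on a toy grid (sizes and variable names invented; 106 quantile levels inferred from the reported output shape) illustrates the likely mechanism: quantile over a dask-backed dimension is lazy, and it requires the reduced dimension to be in a single chunk, so if the data is chunked along time, computing it (which to_netcdf triggers) can rechunk the entire original array into memory at once.

```python
import numpy as np
import xarray as xr

# Small stand-in for the hindcast dataset; the real one was
# (lon=180, lat=336, time=105193), about 47 GiB as float64.
ds = xr.Dataset(
    {"hs": (("time", "lon", "lat"),
            np.random.default_rng(1).random((500, 8, 6)))}
)

# chunk() makes the variable dask-backed, as open_mfdataset would.
# Chunking along lon (not time) keeps "time" in a single chunk, so the
# quantile reduction works on thin pencils of the time axis instead of
# forcing a whole-array rechunk like the one in the reported traceback.
dask_ds = ds.chunk({"lon": 2})

# Lazy reduction: no values are materialized yet; the graph still
# references every chunk of the original array.
q = dask_ds.quantile(np.linspace(0.0, 1.0, 106), dim="time")

# to_netcdf() on the lazy result executes the whole graph at once; with
# the full-size data that is the step where the 47.4 GiB allocation
# happened. Forcing the small result into memory first makes the write
# itself trivial.
q = q.compute()
# q.to_netcdf("quantiles_hindc.nc")  # would now write only the reduced array
```

This also explains why reassigning to ds or calling ds.close() did not help: the lazy quantile result keeps references to the original chunks until it is computed, regardless of what the variable is named.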
CREATE TABLE [issues] (
  [id] INTEGER PRIMARY KEY,
  [node_id] TEXT,
  [number] INTEGER,
  [title] TEXT,
  [user] INTEGER REFERENCES [users]([id]),
  [state] TEXT,
  [locked] INTEGER,
  [assignee] INTEGER REFERENCES [users]([id]),
  [milestone] INTEGER REFERENCES [milestones]([id]),
  [comments] INTEGER,
  [created_at] TEXT,
  [updated_at] TEXT,
  [closed_at] TEXT,
  [author_association] TEXT,
  [active_lock_reason] TEXT,
  [draft] INTEGER,
  [pull_request] TEXT,
  [body] TEXT,
  [reactions] TEXT,
  [performed_via_github_app] TEXT,
  [state_reason] TEXT,
  [repo] INTEGER REFERENCES [repos]([id]),
  [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);