issue_comments
2 rows where author_association = "NONE" and issue = 361016974 sorted by updated_at descending
id: 460325261
html_url: https://github.com/pydata/xarray/issues/2417#issuecomment-460325261
issue_url: https://api.github.com/repos/pydata/xarray/issues/2417
node_id: MDEyOklzc3VlQ29tbWVudDQ2MDMyNTI2MQ==
user: andytraumueller (10809480)
created_at: 2019-02-04T16:57:27Z
updated_at: 2019-02-04T20:07:09Z
author_association: NONE
issue: Limiting threads/cores used by xarray(/dask?) (361016974)
reactions: total 2 (+1: 2)

body:

Hi, my test code is now running properly on 5 threads. Thanks for the help.

```python
import xarray as xr
import dask
from multiprocessing.pool import ThreadPool

# Limit dask's threaded scheduler to a pool of 5 threads.
with dask.config.set(scheduler='threads', pool=ThreadPool(5)):
    dset = xr.open_mfdataset("/data/Environmental_Data/Sea_Surface_Height/*.nc",
                             engine='netcdf4', concat_dim='time',
                             chunks={"latitude": 180, "longitude": 360})
    dset1 = dset["adt"] - dset["sla"]
    dset["ssh_mean"] = dset1
    dset = dset.drop("crs")
    dset = dset.drop("lat_bnds")
    dset = dset.drop("lon_bnds")
    dset = dset.drop("__xarray_dataarray_variable__")
    dset = dset.drop("nv")
    dset_all_over_monthly_mean = dset.groupby("time.month").mean(dim="time", skipna=True)
    dset_all_over_season1_mean = dset_all_over_monthly_mean.sel(month=[1, 2, 3])
    dset_all_over_season1_mean = dset_all_over_season1_mean.mean(dim="month", skipna=True)
    dset_all_over_season1_mean.to_netcdf("/data/Environmental_Data/dump/mean/all_over_season1_mean_ssh_copernicus_0.25deg_season1_data_mean.nc")
```
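The idea in the comment above, capping concurrency by handing the scheduler a fixed-size pool, can be sketched with only the standard library. Here `ThreadPoolExecutor(max_workers=5)` plays the role of `ThreadPool(5)`; this is a hedged stdlib analogy, not xarray/dask itself:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

seen = set()
lock = threading.Lock()

def task(i):
    # Record which worker threads ever ran a task.
    with lock:
        seen.add(threading.current_thread().name)
    return i * i

# At most 5 worker threads run concurrently, leaving the remaining cores free.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(task, range(100)))

assert len(seen) <= 5
print(sum(results))  # 328350
```

The same cap applies to dask's threaded scheduler when it is given a bounded pool, which is what `dask.config.set(scheduler='threads', pool=ThreadPool(5))` does in the comment's code.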
id: 460292772
html_url: https://github.com/pydata/xarray/issues/2417#issuecomment-460292772
issue_url: https://api.github.com/repos/pydata/xarray/issues/2417
node_id: MDEyOklzc3VlQ29tbWVudDQ2MDI5Mjc3Mg==
user: andytraumueller (10809480)
created_at: 2019-02-04T15:34:04Z
updated_at: 2019-02-04T15:34:04Z
author_association: NONE
issue: Limiting threads/cores used by xarray(/dask?) (361016974)
reactions: total 0

body:

I am also interested. I am running a lot of critical processes and want to keep at least 5 cores idle.
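Keeping a fixed number of cores idle, as the comment above asks for, can be expressed by sizing the worker pool from the machine's core count. A minimal stdlib sketch, where the `5` matches the comment's request and `idle_cores` is an illustrative name:

```python
import os

idle_cores = 5
total = os.cpu_count() or 1          # os.cpu_count() may return None
workers = max(1, total - idle_cores)  # never drop below one worker
print(workers)
```

The resulting `workers` value is what one would pass as the pool size (e.g. `ThreadPool(workers)`) so that roughly `idle_cores` cores stay free for other processes.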
```sql
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
```
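As a sketch, the schema above can be exercised with Python's stdlib `sqlite3`. The foreign-key `REFERENCES` clauses are omitted because the `users` and `issues` tables are not created here, and only a few columns are filled, with values taken from the rows shown above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
""")
conn.executemany(
    "INSERT INTO issue_comments (id, user, issue, author_association, updated_at)"
    " VALUES (?, ?, ?, ?, ?)",
    [
        (460325261, 10809480, 361016974, "NONE", "2019-02-04T20:07:09Z"),
        (460292772, 10809480, 361016974, "NONE", "2019-02-04T15:34:04Z"),
    ],
)
# This is the query behind the page: filter on issue (served by
# idx_issue_comments_issue) and sort by updated_at descending.
ids = [r[0] for r in conn.execute(
    "SELECT id FROM issue_comments WHERE issue = ? ORDER BY updated_at DESC",
    (361016974,),
)]
print(ids)  # [460325261, 460292772]
```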