issues


2 rows where state = "closed" and user = 2444231 sorted by updated_at descending

3858 · Backend env · pgierz (2444231) · closed
  • id: 579722569 · node_id: MDExOlB1bGxSZXF1ZXN0Mzg3MDY0ODEz
  • comments: 5 · created_at: 2020-03-12T06:30:28Z · updated_at: 2023-01-05T03:58:54Z · closed_at: 2023-01-05T03:58:54Z
  • author_association: NONE · draft: 0 · pull_request: pydata/xarray/pulls/3858

This merge request lets the user set a backend_env while opening a file. It should be a dictionary whose key/value pairs are temporarily added to os.environ while the file is opened; the old environment is restored afterwards.

  • [x] Closes #3853 (maybe; I'm not sure it is wise to actually close that issue at this point. I added the backend_env idea.)
  • [ ] Tests added

Here I need some help: how should I actually design the tests? The environment is only modified temporarily, so as soon as the open_dataset function returns, the environment is restored. I would have thought that temporarily adding the equivalent of $ export lala=tada to the environment and checking for it would work; but since the environment is restored right away, I see no way to actually observe the modified os.environ from outside.

  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

I added a section to the relevant docstring; I'm not sure how much of this also needs to be included in the other files.
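The restore-on-exit behaviour described above, and one way to make it testable, can be sketched with a small stdlib-only context manager. `temp_env` and the recording dict are hypothetical illustrations, not code from this PR: the trick for testing is to observe `os.environ` *while* the block is active (where the backend would open the file), rather than after the call returns.

```python
import os
from contextlib import contextmanager


@contextmanager
def temp_env(env=None):
    """Temporarily add key/value pairs to os.environ; restore the old state on exit."""
    env = env or {}
    saved = {k: os.environ.get(k) for k in env}  # remember prior values (None = absent)
    os.environ.update(env)
    try:
        yield
    finally:
        for key, old in saved.items():
            if old is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = old


# Testing idea: record what the environment looked like *inside* the block,
# standing in for the moment the backend actually opens the file.
seen = {}
with temp_env({"lala": "tada"}):
    seen["lala"] = os.environ.get("lala")

assert seen["lala"] == "tada"     # the variable was visible while "open"
assert "lala" not in os.environ   # and was removed again afterwards
```

In a real test suite the recording step could be a mock or preprocess hook injected into the open call; the key point is that the assertion on the modified environment must run inside the context, not after it.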

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3858/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  • repo: xarray (13221727) · type: pull
2066 · open_mfdataset can't handle many files · pgierz (2444231) · closed
  • id: 315381649 · node_id: MDU6SXNzdWUzMTUzODE2NDk=
  • comments: 7 · created_at: 2018-04-18T08:33:15Z · updated_at: 2019-03-18T14:58:15Z · closed_at: 2019-03-18T14:58:14Z
  • author_association: NONE

Code Sample, a copy-pastable example if possible

It appears that open_mfdataset cannot handle many files (here, "many" = 1200):

```python
ensemble = xr.open_mfdataset("/scratch/simulation_database/incoming/Eem125-S2/output/Eem125-S2_echam5_main_mm_26*.nc")

OSError                                   Traceback (most recent call last)
<ipython-input-4-038705c4f255> in <module>()
----> 1 ensemble = xr.open_mfdataset("/scratch/simulation_database/incoming/Eem125-S2/output/Eem125-S2_echam5_main_mm_26*.nc")

~/anaconda3/lib/python3.6/site-packages/xarray/backends/api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, **kwargs)

~/anaconda3/lib/python3.6/site-packages/xarray/backends/api.py in <listcomp>(.0)

~/anaconda3/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables)

~/anaconda3/lib/python3.6/site-packages/xarray/backends/netCDF4_.py in open(cls, filename, mode, format, group, writer, clobber, diskless, persist, autoclose)

~/anaconda3/lib/python3.6/site-packages/xarray/backends/netCDF4_.py in _open_netcdf4_group(filename, mode, group, **kwargs)

netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()

netCDF4/_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

OSError: [Errno 24] Too many open files: b'/scratch/simulation_database/incoming/Eem125-S2/output/Eem125-S2_echam5_main_mm_260001.nc'
```

Problem description

Often, climate simulations produce more than one output file per model component (generally one per saved time step, e.g. months, years, days, or something else). It would be good to access all of these as one object, rather than having to combine them by hand first with e.g. cdo or some other tool.

Expected Output

ensemble variable definition gives me back 1 object 😄
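The error comes from exceeding the OS file-descriptor limit (the open_dataset signature in the traceback already shows an autoclose parameter aimed at this). The general mitigation, bounding how many files are open at any one time, can be sketched with a hypothetical stdlib-only helper; plain text files stand in for netCDF here, and `process_in_batches` is an illustration, not xarray API:

```python
import os
import tempfile


def process_in_batches(paths, batch_size=128):
    """Read many files while keeping at most `batch_size` open at a time.

    Each batch is fully closed before the next is opened, so the OS
    descriptor limit (the source of OSError: [Errno 24]) is never approached
    even for thousands of input files.
    """
    results = []
    for i in range(0, len(paths), batch_size):
        batch = paths[i:i + batch_size]
        handles = [open(p) for p in batch]      # open only this batch
        try:
            results.extend(h.read() for h in handles)
        finally:
            for h in handles:
                h.close()                       # release descriptors promptly
    return results


# Demo: ten small files processed in batches of four.
tmpdir = tempfile.mkdtemp()
paths = []
for n in range(10):
    p = os.path.join(tmpdir, f"part_{n}.txt")
    with open(p, "w") as f:
        f.write(str(n))
    paths.append(p)

combined = process_in_batches(paths, batch_size=4)
assert combined == [str(n) for n in range(10)]
```

With lazy backends such as dask the same idea applies in reverse: defer opening until a chunk is actually read, then close again, rather than holding 1200 handles for the lifetime of the object.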

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.3.final.0
python-bits: 64
OS: Linux
OS-release: 3.13.0-144-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

xarray: 0.10.0
pandas: 0.20.3
numpy: 1.13.3
scipy: 0.19.1
netCDF4: 1.3.1
h5netcdf: 0.5.0
Nio: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.15.3
matplotlib: 2.1.0
cartopy: 0.16.0
seaborn: 0.8.0
setuptools: 36.5.0.post20170921
pip: 9.0.1
conda: 4.5.1
pytest: 3.2.1
IPython: 6.1.0
sphinx: 1.6.3
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2066/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  • state_reason: completed · repo: xarray (13221727) · type: issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · About: xarray-datasette