
issues


7 rows where repo = 13221727, state = "closed" and user = 3924836 sorted by updated_at descending


id node_id number title user state locked assignee milestone comments created_at updated_at closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1639815690 PR_kwDOAMm_X85M2xOo 7671 Delete built-in rasterio backend scottyhq 3924836 closed 0     0 2023-03-24T17:55:50Z 2023-03-29T17:31:27Z 2023-03-29T17:31:26Z MEMBER   0 pydata/xarray/pulls/7671

Following up on #5808, which deprecated xr.open_rasterio in favor of rioxarray, this PR finally removes open_rasterio() entirely, 2 years later!
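A minimal sketch of the migration path, assuming rioxarray is installed (the GeoTIFF path is illustrative):

```python
import rioxarray  # provides open_rasterio and the .rio accessor

# Previously (removed by this PR):
#   import xarray as xr
#   da = xr.open_rasterio("example.tif")

# Replacement via rioxarray:
da = rioxarray.open_rasterio("example.tif")
print(da.rio.crs, da.sizes)
```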

Closes #2314, closes #2535, closes #3489, closes #3776, closes #4655, closes #5207 (either outdated or ‘wontfix’ in favor of rioxarray)

Maybe closes:

#3921

Discussion closure (mark answer as ‘use rioxarray discussions’?):

#5840, #6485, #7327

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7671/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 2,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
811409317 MDU6SXNzdWU4MTE0MDkzMTc= 4925 OpenDAP Documentation Example failing with RunTimeError scottyhq 3924836 closed 0     9 2021-02-18T19:52:24Z 2023-01-17T18:44:00Z 2023-01-17T18:44:00Z MEMBER      

What happened: Tried to follow http://xarray.pydata.org/en/stable/io.html#opendap with @wycheng-uw and ran into a massive traceback ending with RuntimeError: NetCDF: file not found

What you expected to happen: Expected to successfully plot the data as illustrated in the documentation

Minimal Complete Verifiable Example:

```python
remote_data = xr.open_dataset(
    "http://iridl.ldeo.columbia.edu/SOURCES/.OSU/.PRISM/.monthly/dods",
    decode_times=False,
)
tmax = remote_data["tmax"][:500, ::3, ::3]
tmax[0].plot()
```

Anything else we need to know?:

After digging through unresolved OpenDAP issues (https://github.com/Unidata/netcdf4-python/issues/755, https://github.com/pydata/xarray/issues/3466, https://github.com/pydata/xarray/issues/4353), https://github.com/pydata/xarray/issues/3580 provided the key clue: netCDF4>1.5.1 seems to cause issues. Pinning netCDF4==1.5.1 is our current workaround. Not sure if this is specific to http://iridl.ldeo.columbia.edu/ or to OpenDAP endpoints more generally...
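A minimal sketch of guarding against this at runtime (assuming the `packaging` library is available; the warning text is illustrative):

```python
# Warn when the installed netCDF4 is newer than the known-good pin (1.5.1),
# since newer releases appeared to break reads from this OpenDAP endpoint.
import warnings

import netCDF4
from packaging.version import Version  # assumes `packaging` is installed

if Version(netCDF4.__version__) > Version("1.5.1"):
    warnings.warn(
        f"netCDF4 {netCDF4.__version__} detected; remote OpenDAP reads may fail "
        "with 'RuntimeError: NetCDF: file not found'. Consider pinning netCDF4==1.5.1."
    )
```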

Environment:

Output of `xr.show_versions()`:

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.8 | packaged by conda-forge | (default, Jul 31 2020, 02:37:09) [Clang 10.0.1 ]
python-bits: 64
OS: Darwin
OS-release: 20.3.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.6
libnetcdf: 4.7.4
xarray: 0.16.2
pandas: 1.2.2
numpy: 1.19.1
scipy: 1.5.3
netCDF4: 1.5.6
pydap: None
h5netcdf: 0.8.1
h5py: 3.1.0
Nio: None
zarr: None
cftime: 1.4.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.1.7
cfgrib: None
iris: None
bottleneck: None
dask: 2.30.0
distributed: 2.30.0
matplotlib: 3.3.4
cartopy: 0.18.0
seaborn: None
numbagg: None
pint: None
setuptools: 49.6.0.post20210108
pip: 20.2.3
conda: installed
pytest: 6.1.1
IPython: 7.20.0
sphinx: 3.2.1
```
Full Traceback ```pytb --------------------------------------------------------------------------- RuntimeError Traceback (most recent call last) <ipython-input-4-78116fdadd4b> in <module> ----> 1 tmax[0].plot() ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/plot/plot.py in __call__(self, **kwargs) 444 445 def __call__(self, **kwargs): --> 446 return plot(self._da, **kwargs) 447 448 # we can't use functools.wraps here since that also modifies the name / qualname ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/plot/plot.py in plot(darray, row, col, col_wrap, ax, hue, rtol, subplot_kws, **kwargs) 161 162 """ --> 163 darray = darray.squeeze().compute() 164 165 plot_dims = set(darray.dims) ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/dataarray.py in compute(self, **kwargs) 891 """ 892 new = self.copy(deep=False) --> 893 return new.load(**kwargs) 894 895 def persist(self, **kwargs) -> "DataArray": ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/dataarray.py in load(self, **kwargs) 865 dask.array.compute 866 """ --> 867 ds = self._to_temp_dataset().load(**kwargs) 868 new = self._from_temp_dataset(ds) 869 self._variable = new._variable ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/dataset.py in load(self, **kwargs) 747 for k, v in self.variables.items(): 748 if k not in lazy_data: --> 749 v.load() 750 751 return self ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/variable.py in load(self, **kwargs) 437 self._data = as_compatible_data(self._data.compute(**kwargs)) 438 elif not is_duck_array(self._data): --> 439 self._data = np.asarray(self._data) 440 return self 441 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order) 81 82 """ ---> 83 return array(a, dtype, copy=False, order=order) 84 85 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in __array__(self, dtype) 691 692 def __array__(self, dtype=None): --> 693 self._ensure_cached() 694 return np.asarray(self.array, dtype=dtype) 695 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in _ensure_cached(self) 688 def _ensure_cached(self): 689 if not isinstance(self.array, NumpyIndexingAdapter): --> 690 self.array = NumpyIndexingAdapter(np.asarray(self.array)) 691 692 def __array__(self, dtype=None): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order) 81 82 """ ---> 83 return array(a, dtype, copy=False, order=order) 84 85 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in __array__(self, dtype) 661 662 def __array__(self, dtype=None): --> 663 return np.asarray(self.array, dtype=dtype) 664 665 def __getitem__(self, key): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order) 81 82 """ ---> 83 return array(a, dtype, copy=False, order=order) 84 85 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in __array__(self, dtype) 566 def __array__(self, dtype=None): 567 array = as_indexable(self.array) --> 568 return np.asarray(array[self.key], dtype=None) 569 570 def transpose(self, order): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order) 81 82 """ ---> 83 return array(a, dtype, copy=False, order=order) 84 85 
~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/coding/variables.py in __array__(self, dtype) 68 69 def __array__(self, dtype=None): ---> 70 return self.func(self.array) 71 72 def __repr__(self): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/coding/variables.py in _scale_offset_decoding(data, scale_factor, add_offset, dtype) 216 217 def _scale_offset_decoding(data, scale_factor, add_offset, dtype): --> 218 data = np.array(data, dtype=dtype, copy=True) 219 if scale_factor is not None: 220 data *= scale_factor ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/coding/variables.py in __array__(self, dtype) 68 69 def __array__(self, dtype=None): ---> 70 return self.func(self.array) 71 72 def __repr__(self): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/coding/variables.py in _apply_mask(data, encoded_fill_values, decoded_fill_value, dtype) 136 ) -> np.ndarray: 137 """Mask all matching values in a NumPy arrays.""" --> 138 data = np.asarray(data, dtype=dtype) 139 condition = False 140 for fv in encoded_fill_values: ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/numpy/core/_asarray.py in asarray(a, dtype, order) 81 82 """ ---> 83 return array(a, dtype, copy=False, order=order) 84 85 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in __array__(self, dtype) 566 def __array__(self, dtype=None): 567 array = as_indexable(self.array) --> 568 return np.asarray(array[self.key], dtype=None) 569 570 def transpose(self, order): ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/backends/netCDF4_.py in __getitem__(self, key) 71 def __getitem__(self, key): 72 return indexing.explicit_indexing_adapter( ---> 73 key, self.shape, indexing.IndexingSupport.OUTER, self._getitem 74 ) 75 ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/core/indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method) 851 """ 852 raw_key, numpy_indices = decompose_indexer(key, shape, indexing_support) --> 853 result = raw_indexing_method(raw_key.tuple) 854 if numpy_indices.tuple: 855 # index the loaded np.ndarray ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/backends/netCDF4_.py in _getitem(self, key) 83 with self.datastore.lock: 84 original_array = self.get_array(needs_lock=False) ---> 85 array = getitem(original_array, key) 86 except IndexError: 87 # Catch IndexError in netCDF4 and return a more informative ~/miniconda3/envs/intake-stac-gui/lib/python3.7/site-packages/xarray/backends/common.py in robust_getitem(array, key, catch, max_retries, initial_delay) 50 for n in range(max_retries + 1): 51 try: ---> 52 return array[key] 53 except catch: 54 if n == max_retries: src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__getitem__() src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable._get() src/netCDF4/_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success() RuntimeError: NetCDF: file not found ```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4925/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
473142248 MDExOlB1bGxSZXF1ZXN0MzAxMzY4NjIw 3162 changed url for rasterio network test scottyhq 3924836 closed 0     10 2019-07-26T02:06:20Z 2019-07-31T00:28:53Z 2019-07-31T00:28:46Z MEMBER   0 pydata/xarray/pulls/3162

Fix failing rasterio network test by simplifying the test and using the same image URL as the rasterio library test suite.

  • [x] Closes #3083

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3162/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
387123860 MDExOlB1bGxSZXF1ZXN0MjM1NjgyMjk4 2589 added some logic to deal with rasterio objects in addition to filepaths scottyhq 3924836 closed 0     14 2018-12-04T05:13:33Z 2019-07-05T23:13:49Z 2018-12-23T19:02:53Z MEMBER   0 pydata/xarray/pulls/2589

…h strings

  • [x] Closes #2588
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2589/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
412645481 MDExOlB1bGxSZXF1ZXN0MjU0ODA3NTgz 2782 enable loading remote hdf5 files scottyhq 3924836 closed 0     9 2019-02-20T21:51:02Z 2019-03-16T00:36:12Z 2019-03-16T00:35:58Z MEMBER   0 pydata/xarray/pulls/2782

Enable loading remote HDF5 files. This will require h5py>2.9.0 and some changes to https://github.com/shoyer/h5netcdf. For now I've just made a quick hack change to backends/api.py, so further tests are needed. Pinging @jhamman, @mrocklin, and @rabernat for thoughts on this.

Here is a short notebook demonstrating how this works: https://gist.github.com/scottyhq/790bf19c7811b5c6243ce37aae252ca1

  • [x] Closes #2781
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2782/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
412623833 MDU6SXNzdWU0MTI2MjM4MzM= 2781 enable reading of file-like HDF5 objects scottyhq 3924836 closed 0     2 2019-02-20T20:55:15Z 2019-03-16T00:35:57Z 2019-03-16T00:35:57Z MEMBER      

xarray 0.11.3 currently won't read HDF5 file-like objects:

```python
import xarray as xr
import gcsfs

fs = gcsfs.GCSFileSystem()
images = fs.ls('pangeo-data/grfn-v2/137/')
fileObj = fs.open('pangeo-data/grfn-v2/137/S1-GUNW-A-R-137-tops-20181129_20181123-020010-43220N_41518N-PP-e2c7-v2_0_0.nc')

# but, can we open this w/ xarray anyway? Yes! with modifications to xarray and h5netcdf
da = xr.open_dataset(fileObj, group='/science/grids/data', engine='h5netcdf')
da
```

```pytb
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-3-22e0010de1f2> in <module>()
      1 # but, can we open this w/ xarray anyway? Yes! with modifications to xarray and h5netcdf
----> 2 da = xr.open_dataset(fileObj, group='/science/grids/data', engine='h5netcdf')
      3 da

/srv/conda/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    347         else:
    348             if engine is not None and engine != 'scipy':
--> 349                 raise ValueError('can only read file-like objects with '
    350                                  "default engine or engine='scipy'")
    351             # assume filename_or_obj is a file-like object

ValueError: can only read file-like objects with default engine or engine='scipy'
```

Problem description

It is now possible to do this with h5py>2.9.0; see https://github.com/h5py/h5py/pull/1105. This would be a useful feature because there is a lot of NASA data out there in HDF5. It could open up reading without first writing to disk (to translate to Zarr or other formats, for example). There seem to be many issues related to this: https://github.com/dask/s3fs/issues/144, https://github.com/pydata/xarray/issues/2535

I'm guessing adding this functionality doesn't fix many of the performance issues related to HDF5 and Dask (https://github.com/dask/dask/issues/2488, https://github.com/dask/distributed/issues/2319).

Expected Output

```
<xarray.Dataset>
Dimensions:              (latitude: 2045, longitude: 4158)
Coordinates:
  * longitude            (longitude) float64 -123.1 -123.1 ... -119.6 -119.6
  * latitude             (latitude) float64 43.22 43.22 43.22 ... 41.52 41.52
Data variables:
    crs                  int32 ...
    unwrappedPhase       (latitude, longitude) float32 ...
    coherence            (latitude, longitude) float32 ...
    connectedComponents  (latitude, longitude) float32 ...
    amplitude            (latitude, longitude) float32 ...
```

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.7 | packaged by conda-forge | (default, Nov 21 2018, 03:09:43) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.14.65+
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.6.2
xarray: 0.11.3
pandas: 0.24.1
numpy: 1.16.1
scipy: 1.2.0
netCDF4: 1.4.2
pydap: None
h5netcdf: 0.6.2
h5py: 2.9.0
Nio: None
zarr: 2.2.0
cftime: 1.0.3.4
PseudonetCDF: None
rasterio: 1.0.18
cfgrib: None
iris: None
bottleneck: None
cyordereddict: None
dask: 1.1.0
distributed: 1.25.2
matplotlib: 3.0.2
cartopy: 0.17.0
seaborn: 0.9.0
setuptools: 40.7.1
pip: 19.0.2
conda: 4.6.3
pytest: None
IPython: 7.1.1
sphinx: None
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2781/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
387123433 MDU6SXNzdWUzODcxMjM0MzM= 2588 Enabling rasterio.vrt.WarpedVRT with xr.open_rasterio scottyhq 3924836 closed 0     0 2018-12-04T05:11:12Z 2018-12-23T19:02:53Z 2018-12-23T19:02:53Z MEMBER      

This is not a bug, but rather a feature request and a discussion opener for changes to the open_rasterio function.

Currently open_rasterio (xarray version 0.11) only accepts filepath strings and does not work with in-memory rasterio.vrt.WarpedVRT objects. I have a solution (see pull request), but it's likely not the best one (many context managers can feel odd); see the example below:

```python
# Lazy in-memory warping from UTM to WGS84 lat/lon
with env:
    with rasterio.open(url) as src:
        da = xr.open_rasterio(src)
        print(da.crs, da.sizes)
        with WarpedVRT(src, crs='epsg:4326') as vrt:
            with xr.open_rasterio(vrt) as da:
                print(da.crs, da.sizes)

# +init=epsg:32610 (1, 7531, 7751)
# +init=epsg:4326 +no_defs (1, 5981, 9183)
```

More detailed gist here: https://gist.github.com/scottyhq/ae90084adaf25e3b361b096d555c45f1

Problem description

In-memory “virtual” reprojection is a key feature of rasterio (see https://gist.github.com/sgillies/7e5cd548110a5b4d45ac1a1d93cb17a3), and it would be fantastic if this worked with xarray and dask distributed. Many workflows require warping between WGS84 lat/lon, UTM, and Google Mercator, and rasterio can handle most other projections as well.

Related to: #1575, #2042, #2288, https://github.com/dask/dask/issues/3255. There seems to be some synergy here with geoxarray and salem. cc @mrocklin, @fmaussion, @geoxarray

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2588/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
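Given this schema, a minimal sketch of the query behind this page, run with Python's sqlite3 module (the local database filename `github.db` is an assumption):

```python
import sqlite3

# Hypothetical local copy of the Datasette database.
conn = sqlite3.connect("github.db")

# Equivalent of the filter at the top of this page:
# repo = 13221727, state = "closed", user = 3924836, newest updates first.
rows = conn.execute(
    """
    SELECT id, number, title, type, updated_at
    FROM issues
    WHERE repo = 13221727 AND state = 'closed' AND "user" = 3924836
    ORDER BY updated_at DESC
    """
).fetchall()

for row in rows:
    print(row)
```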