issues


7 rows where user = 6153603 sorted by updated_at descending

id: 374460958 · number: 2517 · title: Treat accessor dataarrays as members of parent dataset · user: czr137 (6153603) · state: closed · comments: 5 · created_at: 2018-10-26T16:37:45Z · updated_at: 2018-11-05T22:40:46Z · closed_at: 2018-11-05T22:40:46Z · author_association: CONTRIBUTOR

Code Sample

```python
import xarray as xr
import pandas as pd

# What I'm doing with comparison, I'd like to do with actual
comparison = xr.Dataset(
    {'data': (['time'], [100, 30, 10, 3, 1]),
     'altitude': (['time'], [5, 10, 15, 20, 25])},
    coords={'time': pd.date_range('2014-09-06', periods=5, freq='1s')})

# With altitude as a data var, I can do the following:
comparison.swap_dims({'time': 'altitude'}).interp(altitude=12.0).data

# And
for (time, g) in comparison.groupby('time'):
    print(time)
    print(g.altitude.values)

@xr.register_dataset_accessor('acc')
class Accessor(object):
    def __init__(self, xarray_ds):
        self._ds = xarray_ds
        self._altitude = None

    @property
    def altitude(self):
        """An expensive calculation that results in data that not everyone needs."""
        if self._altitude is None:
            self._altitude = xr.DataArray([5, 10, 15, 20, 25],
                                          coords=[('time', self._ds.time)])
        return self._altitude

actual = xr.Dataset(
    {'data': (['time'], [100, 30, 10, 3, 1])},
    coords={'time': pd.date_range('2014-09-06', periods=5, freq='1s')})

# This doesn't work:
actual.swap_dims({'time': 'altitude'}).interp(altitude=12.0).data

# Neither does this:
for (time, g) in actual.groupby('time'):
    print(time)
    print(g.acc.altitude.values)
```

Problem description

I've been using accessors to extend xarray with some custom computation. The altitude in the above dataset is not used every time the data is loaded, but when it is, it is an expensive computation to make (which is why I put it in as an accessor; if it isn't needed, it isn't computed).

Problem is, once it has been computed, I'd like to be able to use it as if it is a regular data_var of the dataset. For example, to interp on the newly computed column, or use it in a groupby.

Please advise if I'm going about this in the wrong way and how I should think about this problem instead.
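One possible workaround (a sketch, not necessarily the recommended pattern): once the accessor has paid for its expensive computation, promote the cached DataArray into the dataset with `assign_coords`, after which `swap_dims`, selection, and `groupby` work exactly as they do in the `comparison` example. The accessor name `acc2` below is arbitrary, chosen to avoid clashing with the `acc` registered above.

```python
import pandas as pd
import xarray as xr

@xr.register_dataset_accessor('acc2')  # hypothetical name, to avoid clashing with 'acc'
class CachedAltitude:
    def __init__(self, xarray_ds):
        self._ds = xarray_ds
        self._altitude = None

    @property
    def altitude(self):
        # Expensive computation, performed at most once per dataset.
        if self._altitude is None:
            self._altitude = xr.DataArray([5, 10, 15, 20, 25],
                                          coords=[('time', self._ds.time.values)])
        return self._altitude

actual = xr.Dataset(
    {'data': (['time'], [100, 30, 10, 3, 1])},
    coords={'time': pd.date_range('2014-09-06', periods=5, freq='1s')})

# Promote the cached result to a real coordinate; Dataset machinery
# (swap_dims, sel, groupby — and interp, with scipy installed) then sees it:
with_alt = actual.assign_coords(altitude=actual.acc2.altitude)
value = with_alt.swap_dims({'time': 'altitude'}).sel(altitude=10)['data']
```

The trade-off is that the promotion step is explicit: the accessor stays lazy, but you must call `assign_coords` once before treating `altitude` as a member of the dataset.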

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Linux
OS-release: 4.18.16-arch1-1-ARCH
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: en_CA.UTF-8
LOCALE: en_CA.UTF-8
xarray: 0.10.8
pandas: 0.23.1
numpy: 1.14.5
scipy: 1.1.0
netCDF4: 1.4.0
h5netcdf: 0.6.1
h5py: 2.8.0
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.5
distributed: 1.21.8
matplotlib: 2.2.2
cartopy: None
seaborn: None
setuptools: 39.2.0
pip: 9.0.3
conda: None
pytest: 3.6.1
IPython: None
sphinx: None
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2517/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 374473176 · number: 2518 · title: Allow reduce to return an additional dimension · user: czr137 (6153603) · state: closed · comments: 2 · created_at: 2018-10-26T17:15:29Z · updated_at: 2018-10-27T14:14:39Z · closed_at: 2018-10-27T14:14:39Z · author_association: CONTRIBUTOR

Code Sample, a copy-pastable example if possible

```python
import xarray as xr
from scipy.interpolate import interp1d

airtemps = xr.tutorial.load_dataset('air_temperature')

def interptotarget(y, axis, **kwargs):
    x = kwargs['x']
    target = kwargs['target']
    return interp1d(x, y)(target)

# This works
airtemps.groupby('lat').reduce(interptotarget, dim='lon', x=airtemps.lon, target=213.5)

# This doesn't, but I'd like it to:
airtemps.groupby('lat').reduce(interptotarget, dim='lon', x=airtemps.lon, target=[213.5, 213.6])
```

Problem description

The code above shows how I'd like to use reduce to return an additional dimension, which I would then need to define.

The scipy call to interp1d has no trouble calculating the data, but xarray raises `ValueError: dimensions ('time', 'lat') must have the same length as the number of data dimensions, ndim=3`, as it doesn't know what to do with the additional dimension.

I used the above example as a generic case. I know I could use .interp to do an interpolation, but I have other reduction functions in mind that also produce an additional dimension; the scipy interp1d function just serves as a working example.

Expected Output

```python
import pandas as pd

a = airtemps.groupby('lat').reduce(interptotarget, dim='lon', x=airtemps.lon, target=213.5)
b = airtemps.groupby('lat').reduce(interptotarget, dim='lon', x=airtemps.lon, target=213.6)
expected = xr.concat([a, b], pd.Index([213.5, 213.6], name='ilon'))
```
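For reference, `xr.apply_ufunc` with `output_core_dims` can return a dimension that was not in the input, which plain `.reduce()` cannot. A minimal sketch on synthetic data, using `numpy.interp` in place of the scipy call; the dimension name `ilon` and the helper `interp_to_targets` are illustrative, not part of any API:

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(12.0).reshape(3, 4),
                  dims=('lat', 'lon'),
                  coords={'lat': [10.0, 20.0, 30.0], 'lon': [0.0, 1.0, 2.0, 3.0]})
targets = np.array([0.5, 2.5])

def interp_to_targets(y, x, target):
    # Maps a 1-D slice along 'lon' to a new 'ilon' axis of len(target).
    return np.interp(target, x, y)

out = xr.apply_ufunc(interp_to_targets, da,
                     input_core_dims=[['lon']],     # consumed dimension
                     output_core_dims=[['ilon']],   # newly produced dimension
                     vectorize=True,                # loop over remaining dims
                     kwargs={'x': da.lon.values, 'target': targets})
out = out.assign_coords(ilon=targets)
```

This is a workaround sketch rather than the `reduce` extension the issue requests: `apply_ufunc` is the mechanism xarray provides for functions whose output dimensions differ from their input dimensions.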

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Linux
OS-release: 4.18.16-arch1-1-ARCH
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: en_CA.UTF-8
LOCALE: en_CA.UTF-8
xarray: 0.10.8
pandas: 0.23.1
numpy: 1.14.5
scipy: 1.1.0
netCDF4: 1.4.0
h5netcdf: 0.6.1
h5py: 2.8.0
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.5
distributed: 1.21.8
matplotlib: 2.2.2
cartopy: None
seaborn: None
setuptools: 39.2.0
pip: 9.0.3
conda: None
pytest: 3.6.1
IPython: None
sphinx: None
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2518/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 340757861 · number: 2284 · title: interp over time coordinate · user: czr137 (6153603) · state: closed · comments: 2 · created_at: 2018-07-12T18:54:45Z · updated_at: 2018-07-29T06:09:41Z · closed_at: 2018-07-29T06:09:41Z · author_association: CONTRIBUTOR

Before I start, I'm very excited about the interp addition in 0.10.7. Great addition and thanks to @fujiisoup and @shoyer.

I see there was a bit of a discussion in the interp pull request, #2104, about interpolating over times and that it was suggested to wait for use cases. I can think of an immediate use case in my line of work. I frequently use regular gridded geophysical data (time, lat, lon), not unlike the sample tutorial air_temperature data, and the data must be interpolated to line up with corresponding satellite measurements that are irregularly spaced in lat, lon and time.

Being able to interpolate in one quick step would be fantastic. For example:

```python
ds = xr.tutorial.load_dataset('air_temperature')
ds.interp(lat=60.5, lon=211, time='2013-01-01T03:14:37')
```

Problem description

This currently raises `TypeError: cannot perform reduce with flexible type`.

Desired Output

```
<xarray.Dataset>
Dimensions:  ()
Coordinates:
    lat      float64 60.5
    lon      int64 211
    time     datetime64[ns] 2013-01-01T03:14:37
Data variables:
    air      float64 273.5
```
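(This issue was later closed as completed.) Until running a version where `interp` handles datetime coordinates natively, one manual fallback, sketched here on synthetic data, is to convert the datetime64 axis to integer nanoseconds and interpolate on the numeric axis:

```python
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray([0.0, 10.0], dims='time',
                  coords={'time': pd.to_datetime(['2013-01-01T00:00:00',
                                                  '2013-01-01T01:00:00'])})

target = pd.Timestamp('2013-01-01T00:30:00')
t_ns = da.time.values.astype('int64')                    # datetime64[ns] -> ns since epoch
value = float(np.interp(target.value, t_ns, da.values))  # linear interpolation in time
```

The key observation is that linear interpolation is well-defined on the nanosecond integers underlying `datetime64[ns]`, so the conversion loses nothing for this use case.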

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2284/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 292653302 · number: 1869 · title: Add '_FillValue' to set of valid_encodings for netCDF4 backend · user: czr137 (6153603) · state: closed · comments: 9 · created_at: 2018-01-30T04:57:24Z · updated_at: 2018-02-13T18:34:44Z · closed_at: 2018-02-13T18:34:37Z · author_association: CONTRIBUTOR · pull_request: pydata/xarray/pulls/1869
  • [x] Closes #1865 (remove if there is no corresponding issue, which should only be the case for minor changes)
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
    • Relevant tests passed on my cloned version of the code. I had some skipped tests as I did not install rasterio, Nio, or Zarr. Four unrelated tests failed (2 in netcdf3 and 2 in h5netcdf)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1869/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull
id: 292585285 · number: 1865 · title: Dimension Co-ordinates incorrectly saving _FillValue attribute · user: czr137 (6153603) · state: closed · comments: 6 · created_at: 2018-01-29T22:31:43Z · updated_at: 2018-02-13T18:34:36Z · closed_at: 2018-02-13T18:34:36Z · author_association: CONTRIBUTOR

Code Sample, a copy-pastable example if possible

```python
import xarray as xr
import pandas as pd
import numpy as np

temp = 273.15 + 25 * np.random.randn(2, 2)
lon = [0.0, 5.0]
lat = [10.0, 20.0]

ds = xr.Dataset({'temperature': (['lat', 'lon'], temp)},
                coords={'lat': lat, 'lon': lon})

ds['lat'].attrs = {
    'standard_name': 'latitude',
    'long_name': 'latitude',
    'units': 'degrees_north',
    'axis': 'Y'}
ds['lon'].attrs = {
    'standard_name': 'longitude',
    'long_name': 'longitude',
    'units': 'degrees_east',
    'axis': 'X'}
ds['temperature'].attrs = {
    'standard_name': 'air_temperature',
    'units': 'K'}
ds.attrs = {
    'title': 'non-conforming CF 1.6 data produced by xarray 0.10',
    'Conventions': 'CF-1.6'}

ds.to_netcdf('/tmp/test.nc')
```

Problem description

According to the last sentence of the first paragraph of 2.5.1. Missing data, valid and actual range of data in NetCDF Climate and Forecast (CF) Metadata Conventions 1.7:

Missing data is not allowed in coordinate variables.

When I run the conformance checker, it issues an INFO message to this effect for the coordinate variables.

Output of CF-Checker follows...

```
CHECKING NetCDF FILE: /tmp/29428.nc
=====================
Using CF Checker Version 3.0.0
Checking against CF Version CF-1.6
Using Standard Name Table Version 48 (2017-11-28T15:32:48Z)
Using Area Type Table Version 6 (22 February 2017)

Checking variable: temperature

Checking variable: lat
INFO: attribute _FillValue is being used in a non-standard way

Checking variable: lon
INFO: attribute _FillValue is being used in a non-standard way

ERRORS detected: 0
WARNINGS given: 0
INFORMATION messages: 2
```

Expected Output

Coordinate variables should not store a `_FillValue` attribute.
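A write-time workaround that exists in xarray's encoding machinery: setting `'_FillValue': None` in a variable's encoding tells the writer to omit the attribute entirely. A minimal sketch (the `to_netcdf` calls are commented out so the snippet carries no backend dependency):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {'temperature': (['lat', 'lon'], 273.15 + 25 * np.random.randn(2, 2))},
    coords={'lat': [10.0, 20.0], 'lon': [0.0, 5.0]})

# _FillValue: None in the encoding suppresses the attribute on write.
for coord in ('lat', 'lon'):
    ds[coord].encoding['_FillValue'] = None

# ds.to_netcdf('/tmp/test.nc')  # requires a netCDF backend (netCDF4 or scipy)
# Equivalent one-shot form, passing the encoding at write time:
# ds.to_netcdf('/tmp/test.nc', encoding={'lat': {'_FillValue': None},
#                                        'lon': {'_FillValue': None}})
```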

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 64
OS: Linux
OS-release: 4.14.15-1-ARCH
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
xarray: 0.10.0
pandas: 0.22.0
numpy: 1.14.0
scipy: 1.0.0
netCDF4: 1.3.1
h5netcdf: 0.5.0
Nio: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.16.1
matplotlib: 2.1.2
cartopy: None
seaborn: None
setuptools: 38.4.0
pip: 9.0.1
conda: None
pytest: None
IPython: None
sphinx: None
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1865/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 216833414 · number: 1327 · title: Add 'count' as option for how in dataset resample · user: czr137 (6153603) · state: closed · comments: 2 · created_at: 2017-03-24T16:11:25Z · updated_at: 2018-02-13T18:05:57Z · closed_at: 2018-02-13T18:05:57Z · author_association: CONTRIBUTOR

All of the usual aggregations are included in resample, but 'count' is missing. At present, `how='count'` fails because the underlying function doesn't accept the `skipna` kwarg that resample passes to it.
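For context, resample aggregations are also exposed as methods on the resample object, and `.count()` counts the non-NaN values in each bin. A minimal sketch on synthetic data:

```python
import numpy as np
import pandas as pd
import xarray as xr

da = xr.DataArray(
    [1.0, np.nan, 3.0, 4.0],
    dims='time',
    coords={'time': pd.date_range('2017-01-01', periods=4, freq='12h')})

# Count the non-NaN values in each daily bin:
# day 1 holds [1.0, nan] -> 1, day 2 holds [3.0, 4.0] -> 2.
counts = da.resample(time='1D').count()
```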

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1327/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 216836795 · number: 1328 · title: Add how=count option to resample · user: czr137 (6153603) · state: closed · comments: 3 · created_at: 2017-03-24T16:23:32Z · updated_at: 2017-09-01T15:57:36Z · closed_at: 2017-09-01T15:57:35Z · author_association: CONTRIBUTOR · pull_request: pydata/xarray/pulls/1328
  • [x] closes #1327
  • [ ] tests added / passed
  • [ ] passes `git diff upstream/master | flake8 --diff`
  • [ ] whatsnew entry
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1328/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

Table schema:

```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
```
Powered by Datasette · About: xarray-datasette