
issues


18 rows where repo = 13221727, type = "pull" and user = 514053 sorted by updated_at descending


All 18 rows share: type = pull · state = closed · repo = xarray.
Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at (sorted descending), closed_at, author_association, active_lock_reason, draft, pull_request, body, reactions, performed_via_github_app, state_reason, repo, type
23272260 MDExOlB1bGxSZXF1ZXN0MTAyNzUzMTg= 2 Data objects now have a swappable backend store. akleeman 514053 closed 0     3 2013-11-25T20:48:40Z 2016-01-04T23:11:54Z 2014-01-29T19:20:58Z CONTRIBUTOR   0 pydata/xarray/pulls/2
  • Allows conversion to and from: NetCDF4, scipy.io.netcdf and in memory storage.
  • Added general test cases, and cases for specific backend stores.
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
59730888 MDExOlB1bGxSZXF1ZXN0MzA0MjcxMjU= 359 Raise informative exception when _FillValue and missing_value disagree akleeman 514053 closed 0   0.4.1 1004936 2 2015-03-04T00:22:41Z 2015-03-12T16:33:47Z 2015-03-12T16:32:07Z CONTRIBUTOR   0 pydata/xarray/pulls/359

Previously, conflicting _FillValue and missing_value attributes raised only a bare AssertionError; the exception is now informative.
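A minimal sketch of such a consistency check (a hypothetical helper, not xarray's actual code) might look like:

```
def check_fill_values(attrs):
    """Raise an informative error if _FillValue and missing_value conflict."""
    fill_value = attrs.get('_FillValue')
    missing_value = attrs.get('missing_value')
    if (fill_value is not None and missing_value is not None
            and fill_value != missing_value):
        raise ValueError(
            'Conflicting _FillValue ({!r}) and missing_value ({!r}): '
            'they must agree when both are set.'.format(fill_value, missing_value))
    # fall back to whichever attribute is present
    return fill_value if fill_value is not None else missing_value
```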

58682523 MDExOlB1bGxSZXF1ZXN0Mjk4NjQ5NzA= 334 Fix bug associated with reading / writing of mixed endian data. akleeman 514053 closed 0   0.4 799013 1 2015-02-24T01:57:43Z 2015-02-26T04:45:18Z 2015-02-26T04:45:18Z CONTRIBUTOR   0 pydata/xarray/pulls/334

The right solution is to figure out how to successfully round-trip endianness, but that appears to be a deeper issue inside netCDF4 (https://github.com/Unidata/netcdf4-python/issues/346).

Instead, we force all data to little endian before writing with netCDF4.
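With NumPy, that coercion can be sketched like this (a hedged example; the PR's actual implementation may differ):

```
import numpy as np

def to_little_endian(arr):
    """Return arr in little-endian (or native) byte order, copying only if needed."""
    if arr.dtype.byteorder == '>':  # explicitly big-endian
        return arr.astype(arr.dtype.newbyteorder('<'))
    return arr
```

A big-endian `'>i4'` array comes back as little-endian with identical values; native-order arrays pass through untouched.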

54391570 MDExOlB1bGxSZXF1ZXN0MjczOTI5OTU= 310 More robust CF datetime unit parsing akleeman 514053 closed 0 shoyer 1217238 0.4 799013 1 2015-01-14T23:19:07Z 2015-01-14T23:36:34Z 2015-01-14T23:35:27Z CONTRIBUTOR   0 pydata/xarray/pulls/310

This makes it possible to read datasets that don't follow CF datetime conventions perfectly, such as the following example, which (surprisingly) comes from NCEP/NCAR (you'd think they would follow CF!):

```
ds = xray.open_dataset('http://thredds.ucar.edu/thredds/dodsC/grib/NCEP/GEFS/Global_1p0deg_Ensemble/members/GEFS_Global_1p0deg_Ensemble_20150114_1200.grib2/GC')
print ds['time'].encoding['units']
# u'Hour since 2015-01-14T12:00:00Z'
```
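A lenient parser along these lines (purely illustrative, not xarray's implementation) could normalize the unit string before handing it to a strict decoder:

```
import re

_UNITS_RE = re.compile(r'\s*(\w+)\s+since\s+(.+)', re.IGNORECASE)

def parse_cf_time_units(units):
    """Split 'Hour since 2015-01-14T12:00:00Z' into ('hours', '2015-01-14T12:00:00Z')."""
    match = _UNITS_RE.match(units)
    if match is None:
        raise ValueError('invalid CF datetime units: %r' % units)
    delta, ref = match.groups()
    # normalize case and pluralization: 'Hour' -> 'hours'
    delta = delta.lower().rstrip('s') + 's'
    return delta, ref.strip()
```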

36467304 MDExOlB1bGxSZXF1ZXN0MTc1ODI2ODQ= 175 Modular encoding akleeman 514053 closed 0     9 2014-06-25T10:37:41Z 2014-10-08T20:44:15Z 2014-10-08T20:44:15Z CONTRIBUTOR   0 pydata/xarray/pulls/175

Restructured Backends to make CF conventions handling consistent.

Among other things this includes:
  • EncodedDataStores, which can wrap other stores and allow for modular encoding/decoding.
  • Trivial indices (ds['x'] = ('x', np.arange(10))) are no longer stored on disk and are only created when accessed.
  • AbstractDataStore API change; shouldn't affect external users.
  • missing_value attributes now function like _FillValue.

All current tests are passing (though it could use more new ones).

45289935 MDExOlB1bGxSZXF1ZXN0MjI0NTA0NzA= 248 Removed the object oriented encoding/decoding scheme akleeman 514053 closed 0     0 2014-10-08T20:06:33Z 2014-10-08T20:12:56Z 2014-10-08T20:12:56Z CONTRIBUTOR   0 pydata/xarray/pulls/248

Removed the object oriented encoding/decoding scheme in favor of a model where encoding/decoding happens when a dataset is stored to/ loaded from a DataStore.

Conventions can now be enforced at the DataStore level by overwriting the Datastore.store() and Datastore.load() methods, or as an optional arg to Dataset.load_store, Dataset.dump_to_store.

Includes miscellaneous cleanup.
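The override pattern described above can be sketched with a toy in-memory store (class and method names here are illustrative, not xarray's actual API):

```
class InMemoryStore:
    """Conventions are applied at store()/load() time via overridable hooks."""

    def __init__(self):
        self._variables = {}

    def encode(self, name, value):
        return value  # no-op by default; subclasses enforce a convention here

    def decode(self, name, value):
        return value

    def store(self, variables):
        for name, value in variables.items():
            self._variables[name] = self.encode(name, value)

    def load(self):
        return {name: self.decode(name, value)
                for name, value in self._variables.items()}


class UppercaseNamesStore(InMemoryStore):
    """A toy 'convention': variable names are upper-cased on the way in."""

    def store(self, variables):
        super().store({name.upper(): value for name, value in variables.items()})
```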

26545877 MDExOlB1bGxSZXF1ZXN0MTIwMDU3ODk= 8 Datasets now use data stores to allow swap-able backends akleeman 514053 closed 0     0 2014-01-29T19:25:42Z 2014-06-17T00:35:01Z 2014-01-29T19:30:09Z CONTRIBUTOR   0 pydata/xarray/pulls/8

Data objects now have a swap-able backend store.

  • Allows conversion to and from: NetCDF4, scipy.io.netcdf and in-memory storage.
  • Added general test cases, and cases for specific backend stores.
  • Dataset.translate() can now optionally copy the object.
  • Fixed most unit tests; test_translate_consistency still fails.
35564268 MDExOlB1bGxSZXF1ZXN0MTcwNDU3Mjk= 153 Fix decode_cf_variable. akleeman 514053 closed 0     5 2014-06-12T09:42:47Z 2014-06-12T23:33:46Z 2014-06-12T23:33:46Z CONTRIBUTOR   0 pydata/xarray/pulls/153

decode_cf_variable was still using da.data instead of da.values.

It now also works with DataArray as input.

35627287 MDExOlB1bGxSZXF1ZXN0MTcwODIzNjc= 154 Fix decode_cf_variable, without tests akleeman 514053 closed 0     0 2014-06-12T21:56:10Z 2014-06-12T23:30:15Z 2014-06-12T23:30:15Z CONTRIBUTOR   0 pydata/xarray/pulls/154

Same as #153, but without tests.

31510183 MDExOlB1bGxSZXF1ZXN0MTQ3NDQzOTI= 102 Dataset.concat() can now automatically concat over non-equal variables. akleeman 514053 closed 0     3 2014-04-14T22:19:02Z 2014-06-12T17:33:49Z 2014-04-23T03:24:45Z CONTRIBUTOR   0 pydata/xarray/pulls/102

concat_over=True indicates that concat should concatenate over all variables that are not identical across the datasets being concatenated.
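As a rough sketch of that rule (treating datasets as plain dicts; not xarray's actual code):

```
def variables_to_concat(datasets):
    """Names of variables that are not identical across all datasets.

    Under concat_over=True, any variable whose value differs between
    datasets must be concatenated rather than copied from the first one.
    """
    first = datasets[0]
    differing = set()
    for ds in datasets[1:]:
        for name, value in ds.items():
            if name not in first or first[name] != value:
                differing.add(name)
    return differing
```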

28315331 MDExOlB1bGxSZXF1ZXN0MTI5NDE2MDI= 21 Cf time units persist akleeman 514053 closed 0     4 2014-02-26T08:05:41Z 2014-06-12T17:29:24Z 2014-02-28T01:45:21Z CONTRIBUTOR   0 pydata/xarray/pulls/21

Internally, Datasets convert time coordinates to pandas.DatetimeIndex. The backend function convert_to_cf_variable converts these datetimes back to CF-style times, but the original units were not being preserved.
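The round trip being fixed can be sketched with pandas (a hypothetical helper; the real logic lives in xarray's conventions code):

```
import numpy as np
import pandas as pd

def encode_cf_times(index, units=None):
    """Convert a DatetimeIndex back to CF numeric times, reusing the
    original units string when it was remembered.

    Only 'days since <ref>' and 'hours since <ref>' are handled here.
    """
    if units is None:
        units = 'days since 1970-01-01'  # fallback when no units were preserved
    delta, ref = units.split(' since ')
    seconds = (index - pd.Timestamp(ref)).total_seconds()
    per_unit = {'days': 86400.0, 'hours': 3600.0}[delta]
    return np.asarray(seconds) / per_unit, units
```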

30340123 MDExOlB1bGxSZXF1ZXN0MTQwODExMjk= 86 BUG: Zero dimensional variables couldn't be written to file or serialized. akleeman 514053 closed 0     0 2014-03-27T20:42:06Z 2014-06-12T17:29:11Z 2014-03-28T03:58:43Z CONTRIBUTOR   0 pydata/xarray/pulls/86

Fixed a bug in which writes would fail if Datasets contained 0d variables.

Also added the ability to open Datasets directly from NetCDF3 bytestrings.

34649908 MDExOlB1bGxSZXF1ZXN0MTY1MzU0ODE= 143 Fix decoded_cf_variable was not working. akleeman 514053 closed 0     0 2014-05-30T14:27:13Z 2014-06-12T09:39:20Z 2014-06-12T09:39:20Z CONTRIBUTOR   0 pydata/xarray/pulls/143

Small bug fix, and a test.

35304758 MDExOlB1bGxSZXF1ZXN0MTY4OTY2MjM= 150 Fix DecodedCFDatetimeArray was being incorrectly indexed. akleeman 514053 closed 0   0.2 650893 0 2014-06-09T17:25:05Z 2014-06-09T17:43:50Z 2014-06-09T17:43:50Z CONTRIBUTOR   0 pydata/xarray/pulls/150

This was causing an error in the following situation:

```
ds = xray.Dataset()
ds['time'] = ('time', [np.datetime64('2001-05-01') for i in range(5)])
ds['variable'] = ('time', np.arange(5.))
ds.to_netcdf('test.nc')
ds = xray.open_dataset('./test.nc')
ss = ds.indexed(time=slice(0, 2))
ss.dumps()
```

Thanks @shoyer for the fix.

33307883 MDExOlB1bGxSZXF1ZXN0MTU3NjcwMTU= 125 Only copy datetime64 data if it is using non-nanosecond precision. akleeman 514053 closed 0     7 2014-05-12T13:36:22Z 2014-05-20T19:09:40Z 2014-05-20T19:09:40Z CONTRIBUTOR   0 pydata/xarray/pulls/125

In an attempt to coerce all datetime arrays to nanosecond resolution, utils.as_safe_array() was creating copies of every datetime64 array (via the astype method). This was causing unexpected behavior (bugs) for operations such as concatenation over times (see below).

```
import xray
import pandas as pd

ds = xray.Dataset()
ds['time'] = ('time', pd.date_range('2011-09-01', '2011-09-11'))
times = [ds.indexed(time=[i]) for i in range(10)]
ret = xray.Dataset.concat(times, 'time')
print ret['time']

<xray.DataArray 'time' (time: 10)>
array(['1970-01-02T07:04:40.718526408-0800', '1969-12-31T16:00:00.099966608-0800',
       '1969-12-31T16:00:00.041748384-0800', '1969-12-31T16:00:00.041748360-0800',
       '1969-12-31T16:00:00.041748336-0800', '1969-12-31T16:00:00.041748312-0800',
       '1969-12-31T16:00:00.041748288-0800', '1969-12-31T16:00:00.041748264-0800',
       '1969-12-31T16:00:00.041748240-0800', '1969-12-31T16:00:00.041748216-0800'],
      dtype='datetime64[ns]')
Attributes:
    Empty
```
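The fix amounts to checking precision before converting, something like this (illustrative, not the actual utils code):

```
import numpy as np

def as_nanosecond_datetimes(values):
    """Coerce datetime64 data to nanosecond precision, copying only when needed."""
    values = np.asarray(values)
    if values.dtype.kind == 'M' and values.dtype != np.dtype('datetime64[ns]'):
        return values.astype('datetime64[ns]')  # astype copies
    return values
```

Arrays that are already datetime64[ns] pass through unchanged, so downstream views stay views.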

28603092 MDExOlB1bGxSZXF1ZXN0MTMxMDMwODQ= 40 Encodings for object data types are not saved. akleeman 514053 closed 0     0 2014-03-03T07:22:37Z 2014-04-09T04:10:56Z 2014-03-07T02:21:16Z CONTRIBUTOR   0 pydata/xarray/pulls/40

decode_cf_variable will not save encoding for any 'object' dtypes.

When encoding cf variables check if dtype is np.datetime64 as well as DatetimeIndex.

fixes akleeman/xray/issues/39
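The dtype check described above might look like this sketch (hypothetical helper name):

```
import numpy as np
import pandas as pd

def needs_datetime_encoding(values):
    """True when values hold datetimes that must be CF-encoded on write.

    Checks for np.datetime64 arrays as well as pandas DatetimeIndex.
    """
    if isinstance(values, pd.DatetimeIndex):
        return True
    values = np.asarray(values)
    return bool(np.issubdtype(values.dtype, np.datetime64))
```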

30328907 MDExOlB1bGxSZXF1ZXN0MTQwNzQzOTg= 84 Fix: dataset_repr was failing on empty datasets. akleeman 514053 closed 0     1 2014-03-27T18:29:18Z 2014-03-27T20:09:45Z 2014-03-27T20:05:49Z CONTRIBUTOR   0 pydata/xarray/pulls/84

BUG: dataset_repr was failing on empty datasets.

28730473 MDExOlB1bGxSZXF1ZXN0MTMxNzU2NzY= 46 Test lazy loading from stores using mock XArray classes. akleeman 514053 closed 0     1 2014-03-04T18:50:40Z 2014-03-04T23:24:52Z 2014-03-04T23:10:28Z CONTRIBUTOR   0 pydata/xarray/pulls/46


```
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
```
Powered by Datasette · Queries took 72.768ms · About: xarray-datasette