pull_requests

18 rows where user = 514053
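The filtered view corresponds to a plain SQL query (a sketch of what Datasette runs for a `user = 514053` table filter; the generated SQL may spell out the full column list):

```sql
select * from pull_requests where "user" = 514053;
```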

id node_id number state locked title user body created_at updated_at closed_at merged_at merge_commit_sha assignee milestone draft head base author_association auto_merge repo url merged_by
10275318 MDExOlB1bGxSZXF1ZXN0MTAyNzUzMTg= 2 closed 0 Data objects now have a swappable backend store. akleeman 514053 - Allows conversion to and from: NetCDF4, scipy.io.netcdf and in memory storage. - Added general test cases, and cases for specific backend stores. 2013-11-25T20:48:40Z 2016-12-29T02:39:48Z 2014-01-29T19:20:58Z   5d8e6998d42efa29b62346b0b41b8a6eac27fb47     0 073f52281d55e4ed8c1999fcdcff7d4dba54cd76 eb971ee40161350e79e034cad5d1d9933b78f78d CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/2  
12005789 MDExOlB1bGxSZXF1ZXN0MTIwMDU3ODk= 8 closed 0 Datasets now use data stores to allow swap-able backends akleeman 514053 ``` Data objects now have a swap-able backend store. - Allows conversion to and from: NetCDF4, scipy.io.netcdf and in memory storage. - Added general test cases, and cases for specific backend stores. - Dataset.translate() can now optionally copy the object. - Fixed most unit tests, test_translate_consistency still fails. ``` 2014-01-29T19:25:42Z 2014-06-17T00:35:01Z 2014-01-29T19:30:09Z 2014-01-29T19:30:09Z 1f7bf07ce664cd4d1915956a459312bce9ef8505     0 58551773afcefb0cb32d24ced95602e6fc35b360 6b77d820851d9d9f6d4196c222d8ea75cdf26193 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/8  
12941602 MDExOlB1bGxSZXF1ZXN0MTI5NDE2MDI= 21 closed 0 Cf time units persist akleeman 514053 Internally Datasets convert time coordinates to pandas.DatetimeIndex. The backend function convert_to_cf_variable will convert these datetimes back to CF style times, but the original units were not being preserved. 2014-02-26T08:05:41Z 2014-06-12T17:29:24Z 2014-02-28T01:45:21Z   9b89321f4c39477abb64d09f7c3b238c6ff1c1ee     0 9b403acf84e38418d820b4dd658c865503e3076f 6167e0f3f8617534be0fcf43b9618bd82d431ef4 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/21  
13103084 MDExOlB1bGxSZXF1ZXN0MTMxMDMwODQ= 40 closed 0 Encodings for object data types are not saved. akleeman 514053 decode_cf_variable will not save encoding for any 'object' dtypes. When encoding cf variables check if dtype is np.datetime64 as well as DatetimeIndex. fixes akleeman/xray/issues/39 2014-03-03T07:22:37Z 2014-04-09T04:10:56Z 2014-03-07T02:21:16Z   7daf9d244f727247dd49a11171d3902ebbd5ef43     0 34b65e1af60b1740dd825b47ff80a0e50d0ade64 08a03b3c3a864ae0743623c67c66f72da8422d79 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/40  
13175676 MDExOlB1bGxSZXF1ZXN0MTMxNzU2NzY= 46 closed 0 Test lazy loading from stores using mock XArray classes. akleeman 514053   2014-03-04T18:50:40Z 2014-03-04T23:24:52Z 2014-03-04T23:10:28Z 2014-03-04T23:10:28Z 744cc1dfd2eb641e1677b93991de2fa15fa12b87     0 c002324efb2d1966ad33c21d960f3bfd6dabff90 63ea8c5f7a1792a086e85604b4f267684f299dd4 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/46  
14074398 MDExOlB1bGxSZXF1ZXN0MTQwNzQzOTg= 84 closed 0 Fix: dataset_repr was failing on empty datasets. akleeman 514053 BUG: dataset_repr was failing on empty datasets. 2014-03-27T18:29:18Z 2014-03-27T20:09:45Z 2014-03-27T20:05:49Z 2014-03-27T20:05:49Z 93e318a319e9ab6f5e1a8fa1e118131647709df6     0 68d5e7a0c7b35b9add4ecb6717036f7204118a93 648ce64176410ff0fb397ea7b0c13b41ae588183 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/84  
14081129 MDExOlB1bGxSZXF1ZXN0MTQwODExMjk= 86 closed 0 BUG: Zero dimensional variables couldn't be written to file or serialized. akleeman 514053 Fixed a bug in which writes would fail if Datasets contained 0d variables. Also added the ability to open Datasets directly from NetCDF3 bytestrings. 2014-03-27T20:42:06Z 2014-06-12T17:29:11Z 2014-03-28T03:58:43Z 2014-03-28T03:58:43Z 6e5ba34ac1e034a6c1aea276231548850994e21e     0 59acec9e9ee1def7df6bd570c110759a3760e7cb f41f7f0d2937239e695bcdadc697ca688c62bf67 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/86  
14744392 MDExOlB1bGxSZXF1ZXN0MTQ3NDQzOTI= 102 closed 0 Dataset.concat() can now automatically concat over non-equal variables. akleeman 514053 concat_over=True indicates that concat should concat over all variables that are not the same in the set of datasets that are to be concatenated. 2014-04-14T22:19:02Z 2014-06-12T17:33:49Z 2014-04-23T03:24:45Z 2014-04-23T03:24:45Z 881122397cf3728b58856cca2986078bfa49c038     0 b9635a53136126980080f4ff80e213c936a3c1e0 4713be2beef8c02818089da7c4d343669b59ff1b CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/102  
15767015 MDExOlB1bGxSZXF1ZXN0MTU3NjcwMTU= 125 closed 0 Only copy datetime64 data if it is using non-nanosecond precision. akleeman 514053 In an attempt to coerce all datetime arrays to nanosecond resolution utils.as_safe_array() was creating copies of any datetime64 array (via the astype method). This was causing unexpected behavior (bugs) for things such as concatenation over times. (see below). ``` import xray import pandas as pd ds = xray.Dataset() ds['time'] = ('time', pd.date_range('2011-09-01', '2011-09-11')) times = [ds.indexed(time=[i]) for i in range(10)] ret = xray.Dataset.concat(times, 'time') print ret['time'] <xray.DataArray 'time' (time: 10)> array(['1970-01-02T07:04:40.718526408-0800', '1969-12-31T16:00:00.099966608-0800', '1969-12-31T16:00:00.041748384-0800', '1969-12-31T16:00:00.041748360-0800', '1969-12-31T16:00:00.041748336-0800', '1969-12-31T16:00:00.041748312-0800', '1969-12-31T16:00:00.041748288-0800', '1969-12-31T16:00:00.041748264-0800', '1969-12-31T16:00:00.041748240-0800', '1969-12-31T16:00:00.041748216-0800'], dtype='datetime64[ns]') Attributes: Empty ``` 2014-05-12T13:36:22Z 2014-05-20T19:09:40Z 2014-05-20T19:09:40Z   e255f9e632bd646190ba6433599ccea7e122cc7f     0 d09708a119d8ca90298673ecd982414017ef53de 8f667bef6e190764cdd801fc857f94f23c8a36c2 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/125  
16535481 MDExOlB1bGxSZXF1ZXN0MTY1MzU0ODE= 143 closed 0 Fix decoded_cf_variable was not working. akleeman 514053 Small bug fix, and a test. 2014-05-30T14:27:13Z 2014-06-12T09:39:20Z 2014-06-12T09:39:20Z   b77a8173175acc504ccf1203576b7be4b111da6e     0 1ebd3a5df08605410d716a002de4e72072dbd7e8 71137d1e50116e5cca63d9b1c169844b5737cec2 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/143  
16896623 MDExOlB1bGxSZXF1ZXN0MTY4OTY2MjM= 150 closed 0 Fix DecodedCFDatetimeArray was being incorrectly indexed. akleeman 514053 This was causing an error in the following situation: ``` ds = xray.Dataset() ds['time'] = ('time', [np.datetime64('2001-05-01') for i in range(5)]) ds['variable'] = ('time', np.arange(5.)) ds.to_netcdf('test.nc') ds = xray.open_dataset('./test.nc') ss = ds.indexed(time=slice(0, 2)) ss.dumps() ``` Thanks @shoyer for the fix. 2014-06-09T17:25:05Z 2014-06-09T17:43:50Z 2014-06-09T17:43:50Z 2014-06-09T17:43:50Z 2ec8b7127f0d27683cb6d32da859a62e00ded6b9   0.2 650893 0 095e7070342a01ce5ee06a4cabd55087ad80395d 3af0e34b90b8ec5436047419ad3ed2402ad5ff24 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/150  
17045729 MDExOlB1bGxSZXF1ZXN0MTcwNDU3Mjk= 153 closed 0 Fix decode_cf_variable. akleeman 514053 decode_cf_variable was still using da.data instead of da.values. It now also works with DataArray as input. 2014-06-12T09:42:47Z 2014-06-12T23:33:46Z 2014-06-12T23:33:46Z   05f01af1d6dffe8e3f23024e56d75806e5979fe5     0 16f17204f3d16485bdba1e1988a17bd6ab570502 606f388df0173c81feddd595a6af8e0ac986e830 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/153  
17082367 MDExOlB1bGxSZXF1ZXN0MTcwODIzNjc= 154 closed 0 Fix decode_cf_variable, without tests akleeman 514053 same as #153, but without tests. 2014-06-12T21:56:10Z 2014-06-12T23:30:15Z 2014-06-12T23:30:15Z 2014-06-12T23:30:15Z ce73ec55da14eb79c986058bf34d766c8142037d     0 dffb5ecac0188baae98e87d7a926db22dd723960 606f388df0173c81feddd595a6af8e0ac986e830 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/154  
17582684 MDExOlB1bGxSZXF1ZXN0MTc1ODI2ODQ= 175 closed 0 Modular encoding akleeman 514053 Restructured Backends to make CF conventions handling consistent. Among other things this includes: - EncodedDataStores which can wrap other stores and allow for modular encoding/decoding. - Trivial indices ds['x'] = ('x', np.arange(10)) are no longer stored on disk and are only created when accessed. - AbstractDataStore API change. Shouldn't affect external users. - missing_value attributes now function like _FillValue All current tests are passing (though it could use more new ones). 2014-06-25T10:37:41Z 2014-10-08T20:44:15Z 2014-10-08T20:44:15Z   676a05aaa20fc74957bd029616ab21fb5b7c74e7     0 134d5ba9f6b44cb4298b374e91ce7da6a19fd8dc 7e0e7b1f2b3663c9fddb7b9f1767e4e7f744d19c CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/175  
22450470 MDExOlB1bGxSZXF1ZXN0MjI0NTA0NzA= 248 closed 0 Removed the object oriented encoding/decoding scheme akleeman 514053 Removed the object oriented encoding/decoding scheme in favor of a model where encoding/decoding happens when a dataset is stored to/ loaded from a DataStore. Conventions can now be enforced at the DataStore level by overwriting the Datastore.store() and Datastore.load() methods, or as an optional arg to Dataset.load_store, Dataset.dump_to_store. Includes miscellaneous cleanup. 2014-10-08T20:06:33Z 2014-10-08T20:12:56Z 2014-10-08T20:12:56Z 2014-10-08T20:12:56Z 92d2dcac92d2b121b29da6d68d01eaf12805853e     0 c2e46d3b216d6da143c8f3c066e0ea000dff6ad8 c185df55d9ccfdb62915c362a09bd724a73de2d4 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/248  
27392995 MDExOlB1bGxSZXF1ZXN0MjczOTI5OTU= 310 closed 0 More robust CF datetime unit parsing akleeman 514053 This makes it possible to read datasets that don't follow CF datetime conventions perfectly, such as the following example which (surprisingly) comes from NCEP/NCAR (you'd think they would follow CF!) ``` ds = xray.open_dataset('http://thredds.ucar.edu/thredds/dodsC/grib/NCEP/GEFS/Global_1p0deg_Ensemble/members/GEFS_Global_1p0deg_Ensemble_20150114_1200.grib2/GC') print ds['time'].encoding['units'] u'Hour since 2015-01-14T12:00:00Z' ``` 2015-01-14T23:19:07Z 2015-01-14T23:36:34Z 2015-01-14T23:35:27Z 2015-01-14T23:35:27Z 96f2c394961b37f6e1238539bb254259c543b8ff shoyer 1217238 0.4 799013 0 d5115cb1947b0679cc9998665a71f5d85e260623 4a4be4ace8f42dbc7c4ab016ab58a46812b37ad1 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/310  
29864970 MDExOlB1bGxSZXF1ZXN0Mjk4NjQ5NzA= 334 closed 0 Fix bug associated with reading / writing of mixed endian data. akleeman 514053 The right solution to this is to figure out how to successfully round trip endian-ness, but that seems to be a deeper issue inside netCDF4 (https://github.com/Unidata/netcdf4-python/issues/346) Instead we force all data to little endian before netCDF4 write. 2015-02-24T01:57:43Z 2015-02-26T04:45:18Z 2015-02-26T04:45:18Z   8634487c2196fc84708be8e49ce59213c7623dfc   0.4 799013 0 b2bec2f7a6e02b3994a8da47aa4845810baaf136 400317e9afbbfafacb3aea4ffd70e8790c936ee6 CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/334  
30427125 MDExOlB1bGxSZXF1ZXN0MzA0MjcxMjU= 359 closed 0 Raise informative exception when _FillValue and missing_value disagree akleeman 514053 Previously conflicting _FillValue and missing_value only raised an AssertionError, now it's more informative. 2015-03-04T00:22:41Z 2015-03-12T16:33:47Z 2015-03-12T16:32:07Z 2015-03-12T16:32:07Z f1dbff3d12aa2f67c70a210651c31a37b60d838b   0.4.1 1004936 0 ec35efd763419f71fcb81a91a70251e55146f0e9 7187bb9af9b2fffedb931dcaa3766b58e769a13e CONTRIBUTOR   xarray 13221727 https://github.com/pydata/xarray/pull/359  

CREATE TABLE [pull_requests] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [state] TEXT,
   [locked] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [body] TEXT,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [merged_at] TEXT,
   [merge_commit_sha] TEXT,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [draft] INTEGER,
   [head] TEXT,
   [base] TEXT,
   [author_association] TEXT,
   [auto_merge] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [url] TEXT,
   [merged_by] INTEGER REFERENCES [users]([id])
);
CREATE INDEX [idx_pull_requests_merged_by]
    ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo]
    ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone]
    ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee]
    ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user]
    ON [pull_requests] ([user]);
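The same data can be queried outside Datasette with Python's built-in sqlite3 module. A minimal sketch: the schema below is a trimmed subset of the CREATE TABLE statement above, and the sample row is PR #2 from the table; against the real database file you would connect to it instead of `:memory:`.

```python
import sqlite3

# In-memory database with a trimmed subset of the pull_requests schema above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [pull_requests] (
   [id] INTEGER PRIMARY KEY,
   [number] INTEGER,
   [state] TEXT,
   [title] TEXT,
   [user] INTEGER,
   [created_at] TEXT
);
CREATE INDEX [idx_pull_requests_user] ON [pull_requests] ([user]);
""")

# Sample row: PR #2 from the table above.
conn.execute(
    "INSERT INTO pull_requests (id, number, state, title, user, created_at) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (10275318, 2, "closed",
     "Data objects now have a swappable backend store.",
     514053, "2013-11-25T20:48:40Z"),
)

# The same filter the page applies: user = 514053. The index above makes
# this lookup efficient on the full table.
rows = conn.execute(
    "SELECT number, state, title FROM pull_requests "
    "WHERE [user] = ? ORDER BY created_at",
    (514053,),
).fetchall()
print(rows)
```

Note that `user` is bracket-quoted in the WHERE clause, matching the quoting style of the DDL; SQLite accepts `[user]` or `"user"` interchangeably.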
Powered by Datasette · About: xarray-datasette