
issues


572 rows where type = "pull" and user = 1217238 sorted by updated_at descending


Facets: state (closed 569, open 3) · type (pull 572) · repo (xarray 572)
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
707647715 MDExOlB1bGxSZXF1ZXN0NDkyMDEzODg4 4453 Simplify and restore old behavior for deep-copies shoyer 1217238 closed 0     3 2020-09-23T20:10:33Z 2023-09-14T03:06:34Z 2023-09-14T03:06:33Z MEMBER   1 pydata/xarray/pulls/4453

Intended to fix https://github.com/pydata/xarray/issues/4449

The goal is to restore behavior to match what we had prior to https://github.com/pydata/xarray/pull/4379 for all types of data other than np.ndarray objects

Needs tests!

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes isort . && black . && mypy . && flake8
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4453/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
342928718 MDExOlB1bGxSZXF1ZXN0MjAyNzE0MjUx 2302 WIP: lazy=True in apply_ufunc() shoyer 1217238 open 0     1 2018-07-20T00:01:21Z 2023-07-18T04:19:17Z   MEMBER   0 pydata/xarray/pulls/2302
  • [x] Closes https://github.com/pydata/xarray/issues/2298
  • [ ] Tests added
  • [ ] Tests passed
  • [ ] Fully documented, including whats-new.rst for all changes and api.rst for new API

Still needs more tests and documentation.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2302/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1767947798 PR_kwDOAMm_X85TkPzV 7933 Update calendar for developers meeting shoyer 1217238 closed 0     0 2023-06-21T16:09:44Z 2023-06-21T17:56:22Z 2023-06-21T17:56:22Z MEMBER   0 pydata/xarray/pulls/7933

The old calendar was on @jhamman's UCAR account, which he no longer has access to!

xref https://github.com/pydata/xarray/issues/4001

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7933/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
895983112 MDExOlB1bGxSZXF1ZXN0NjQ4MTM1NTcy 5351 Add xarray.backends.NoMatchingEngineError shoyer 1217238 open 0     4 2021-05-19T22:09:21Z 2022-11-16T15:19:54Z   MEMBER   0 pydata/xarray/pulls/5351
  • [x] Closes #5329
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5351/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
803068773 MDExOlB1bGxSZXF1ZXN0NTY5MDU5MTEz 4879 Cache files for different CachingFileManager objects separately shoyer 1217238 closed 0     10 2021-02-07T21:48:06Z 2022-10-18T16:40:41Z 2022-10-18T16:40:40Z MEMBER   0 pydata/xarray/pulls/4879

This means that explicitly opening a file multiple times with open_dataset (e.g., after modifying it on disk) now reopens the file from scratch, rather than reusing a cached version.

If users want to reuse the cached file, they can reuse the same xarray object. We don't need this for handling many files in Dask (the original motivation for caching), because in those cases only a single CachingFileManager is created.

I think this should fix some long-standing usability issues: #4240, #4862

Conveniently, this also obviates the need for some messy reference counting logic.
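
As a small illustration of the new behavior (file name and values are made up, not from the PR):

```python
import xarray as xr

# Write a tiny dataset to disk (hypothetical file name).
xr.Dataset({"a": ("x", [1, 2, 3])}).to_netcdf("example.nc")

# Each explicit open_dataset call now manages its own cached handle, so the
# second call re-opens the file from scratch instead of reusing ds1's handle,
# picking up any changes made to the file on disk in between.
ds1 = xr.open_dataset("example.nc")
ds2 = xr.open_dataset("example.nc")
```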

  • [x] Closes #4240, #4862
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4879/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
168272291 MDExOlB1bGxSZXF1ZXN0NzkzMjE2NTc= 924 WIP: progress toward making groupby work with multiple arguments shoyer 1217238 open 0     16 2016-07-29T08:07:57Z 2022-06-09T14:50:17Z   MEMBER   0 pydata/xarray/pulls/924

Fixes #324

It definitely doesn't work properly yet, totally mixing up coordinates, data variables and multi-indexes (as shown by the failing tests).

A simple example:

```
In [4]: coords = {'a': ('x', [0, 0, 1, 1]), 'b': ('y', [0, 0, 1, 1])}

In [5]: square = xr.DataArray(np.arange(16).reshape(4, 4), coords=coords, dims=['x', 'y'])

In [6]: square
Out[6]:
<xarray.DataArray (x: 4, y: 4)>
array([[ 0,  1,  2,  3],
       [ 4,  5,  6,  7],
       [ 8,  9, 10, 11],
       [12, 13, 14, 15]])
Coordinates:
    b        (y) int64 0 0 1 1
    a        (x) int64 0 0 1 1
  * x        (x) int64 0 1 2 3
  * y        (y) int64 0 1 2 3

In [7]: square.groupby(['a', 'b']).mean()
Out[7]:
<xarray.DataArray (a: 2, b: 2)>
array([[  2.5,   4.5],
       [ 10.5,  12.5]])
Coordinates:
  * a        (a) int64 0 1
  * b        (b) int64 0 1

In [8]: square.groupby(['x', 'y']).mean()
Out[8]:
<xarray.DataArray (x: 4, y: 4)>
array([[  0.,   1.,   2.,   3.],
       [  4.,   5.,   6.,   7.],
       [  8.,   9.,  10.,  11.],
       [ 12.,  13.,  14.,  15.]])
Coordinates:
  * x        (x) int64 0 1 2 3
  * y        (y) int64 0 1 2 3
```

More examples: https://gist.github.com/shoyer/5cfa4d5751e8a78a14af25f8442ad8d5

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/924/reactions",
    "total_count": 4,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 3,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
327166000 MDExOlB1bGxSZXF1ZXN0MTkxMDMwMjA4 2195 WIP: explicit indexes shoyer 1217238 closed 0     3 2018-05-29T04:25:15Z 2022-03-21T14:59:52Z 2022-03-21T14:59:52Z MEMBER   0 pydata/xarray/pulls/2195

Some utility functions that should be useful for https://github.com/pydata/xarray/issues/1603

Still very much a work in progress -- it would be great if someone has time to finish writing any of these in another PR!

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2195/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1062709354 PR_kwDOAMm_X84u-sO9 6025 Simplify missing value handling in xarray.corr shoyer 1217238 closed 0     1 2021-11-24T17:48:03Z 2021-11-28T04:39:22Z 2021-11-28T04:39:22Z MEMBER   0 pydata/xarray/pulls/6025

This PR simplifies the fix from https://github.com/pydata/xarray/pull/5731, specifically for the benefit of xarray.corr. There is no need to use map_blocks instead of using where directly.

It is basically an alternative version of https://github.com/pydata/xarray/pull/5284. It is potentially slightly less efficient to do this masking step when unnecessary, but I doubt this makes a noticeable performance difference in practice (and I doubt this optimization is useful inside map_blocks, anyway).
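
A minimal sketch of the kind of where-based masking involved, written against the public API rather than xarray's internals:

```python
import numpy as np
import xarray as xr

a = xr.DataArray([1.0, 2.0, np.nan, 4.0], dims="x")
b = xr.DataArray([np.nan, 1.0, 3.0, 2.0], dims="x")

# Keep only positions where both inputs are valid -- the masking step that
# xarray.corr needs before computing the covariance terms.
valid = a.notnull() & b.notnull()
a_masked = a.where(valid)
b_masked = b.where(valid)
```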

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6025/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1044151556 PR_kwDOAMm_X84uELYB 5935 Docs: fix URL for PTSA shoyer 1217238 closed 0     1 2021-11-03T21:56:44Z 2021-11-05T09:36:04Z 2021-11-05T09:36:04Z MEMBER   0 pydata/xarray/pulls/5935

One of the PTSA authors told me about the new URL by email.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5935/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
948890466 MDExOlB1bGxSZXF1ZXN0NjkzNjY1NDEy 5624 Make typing-extensions optional shoyer 1217238 closed 0     6 2021-07-20T17:43:22Z 2021-07-22T23:30:49Z 2021-07-22T23:02:03Z MEMBER   0 pydata/xarray/pulls/5624

Type checking may be a little worse if typing-extensions is not installed, but I don't think it's worth the trouble of adding another hard dependency just for one use of TypeGuard.

Note: sadly this doesn't work yet. Mypy (and pylance) don't like the type alias defined with try/except. Any ideas? In the worst case, we could revert the TypeGuard entirely, but that would be a shame...
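
For context, a minimal sketch of the optional-import pattern under discussion (the fallback shown here is illustrative, not the code in this PR):

```python
try:
    # Preferred: real TypeGuard support when typing-extensions is installed.
    from typing_extensions import TypeGuard
except ImportError:
    # Fallback when typing-extensions is missing. mypy and pylance reportedly
    # dislike an alias defined inside try/except, which is the problem above.
    from typing import Any

    TypeGuard = Any
```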

  • [x] Closes #5495
  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5624/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
874331538 MDExOlB1bGxSZXF1ZXN0NjI4OTE0NDQz 5252 Add mode="r+" for to_zarr and use consolidated writes/reads by default shoyer 1217238 closed 0     14 2021-05-03T07:57:16Z 2021-06-22T06:51:35Z 2021-06-17T17:19:26Z MEMBER   0 pydata/xarray/pulls/5252

mode="r+" only allows for modifying pre-existing array values in a Zarr store. This makes it a safer default mode when doing a limited region write. It also offers a nice performance bonus when using consolidated metadata, because the store to modify can be opened in "consolidated" mode -- rather than painfully slow non-consolidated mode.

This PR includes several related changes to to_zarr():

  1. It adds support for the new mode="r+".
  2. consolidated=True in to_zarr() now means "open in consolidated mode" if using mode="r+", instead of "write in consolidated mode" (which would not make sense for r+).
  3. It allows setting consolidated=True when using region, mostly for the sake of fast store opening with r+.
  4. Validation in to_zarr() has been reorganized to always use the existing Zarr group, rather than re-opening zarr stores from scratch, which could require additional network requests.
  5. Incidentally, I've renamed the ZarrStore.ds attribute to ZarrStore.zarr_group, which is a much more descriptive name.

These changes gave me a ~5x boost in write performance in a large parallel job making use of to_zarr with region.
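
A rough usage sketch of the new mode (store path and sizes are made up; this assumes a local directory store):

```python
import dask.array as da
import xarray as xr

path = "example_store.zarr"

# Lay out the full store once without writing array values (see PR #4035).
full = xr.Dataset({"u": (("x",), da.zeros(200, chunks=100))})
full.to_zarr(path, compute=False)

# Later, possibly from another worker: modify just one region. mode="r+"
# only touches pre-existing arrays, and consolidated=True opens the target
# store via its consolidated metadata.
part = xr.Dataset({"u": (("x",), da.arange(100, chunks=100))})
part.to_zarr(path, mode="r+", region={"x": slice(0, 100)}, consolidated=True)
```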

  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5252/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
891253662 MDExOlB1bGxSZXF1ZXN0NjQ0MTQ5Mzc2 5300 Better error message when no backend engine is found. shoyer 1217238 closed 0     4 2021-05-13T18:10:04Z 2021-05-18T21:23:00Z 2021-05-18T21:23:00Z MEMBER   0 pydata/xarray/pulls/5300

Also includes a better error message when loading a tutorial dataset but an underlying IO dependency is not found.

  • [x] Fixes #5291
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5300/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
890573049 MDExOlB1bGxSZXF1ZXN0NjQzNTc1Mjc5 5296 More robust guess_can_open for netCDF4/scipy/h5netcdf entrypoints shoyer 1217238 closed 0     1 2021-05-12T23:53:32Z 2021-05-14T22:40:14Z 2021-05-14T22:40:14Z MEMBER   0 pydata/xarray/pulls/5296

The new version checks magic numbers in files on disk, not just already open file objects.
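
A minimal sketch of that kind of on-disk check (my own illustration, not the code added in this PR):

```python
def looks_like_netcdf(path):
    # Compare the file's leading bytes against known magic numbers:
    # classic netCDF starts with b"CDF", HDF5-based netCDF4 files with
    # the 8-byte HDF5 signature.
    with open(path, "rb") as f:
        magic = f.read(8)
    return magic.startswith(b"CDF") or magic.startswith(b"\x89HDF\r\n\x1a\n")
```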

I've also added a bunch of unit-tests.

Fixes GH5295

  • [x] Closes #5295
  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5296/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
645062817 MDExOlB1bGxSZXF1ZXN0NDM5NTg4OTU1 4178 Fix min_deps_check; revert to support numpy=1.14 and pandas=0.24 shoyer 1217238 closed 0     5 2020-06-25T00:37:19Z 2021-02-27T21:46:43Z 2021-02-27T21:46:42Z MEMBER   1 pydata/xarray/pulls/4178

Fixes the issue noticed in: https://github.com/pydata/xarray/pull/4175#issuecomment-649135372

Let's see if this passes CI...

  • [x] Passes isort -rc . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4178/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
613012939 MDExOlB1bGxSZXF1ZXN0NDEzODQ3NzU0 4035 Support parallel writes to regions of zarr stores shoyer 1217238 closed 0     17 2020-05-06T02:40:19Z 2020-11-04T06:19:01Z 2020-11-04T06:19:01Z MEMBER   0 pydata/xarray/pulls/4035

This PR adds support for a region keyword argument to to_zarr(), to support parallel writes to different parts of arrays in a zarr stores, e.g., ds.to_zarr(..., region={'x': slice(1000, 2000)}) to write a dataset over the range 1000:2000 along the x dimension.

This is useful for creating large Zarr datasets without requiring dask. For example, the separate workers in a simulation job might each write a single non-overlapping chunk of a Zarr file. The standard way to handle such datasets today is to first write netCDF files in each process, and then consolidate them afterwards with dask (see #3096).

Creating empty Zarr stores

In order to do so, the Zarr file must be pre-existing with desired variables in the right shapes/chunks. It is desirable to be able to create such stores without actually writing data, because datasets that we want to write in parallel may be very large.

In the example below, I achieve this by filling a Dataset with dask arrays, and passing compute=False to to_zarr(). This works, but it relies on an undocumented implementation detail of the compute argument. We should either:

  1. Officially document that the compute argument only controls writing array values, not metadata (at least for zarr).
  2. Add a new keyword argument or entire new method for creating an unfilled Zarr store, e.g., write_values=False.

I think (1) is maybe the cleanest option (no extra API endpoints).

Unchunked variables

One potential gotcha concerns coordinate arrays that are not chunked, e.g., consider parallel writing of a dataset divided along time with 2D latitude and longitude arrays that are fixed over all chunks. With the current PR, such coordinate arrays would get rewritten by each separate writer.

If a Zarr store does not have atomic writes, then conceivably this could result in corrupted data. The default DirectoryStore has atomic writes and cloud based object stores should also be atomic, so perhaps this doesn't matter in practice, but at the very least it's inefficient and could cause issues for large-scale jobs due to resource contention.

Options include:

  1. Current behavior. Variables whose dimensions do not overlap with region are written by to_zarr(). This is likely the most intuitive behavior for writing from a single process at a time.
  2. Exclude variables whose dimensions do not overlap with region from being written. This is likely the most convenient behavior for writing from multiple processes at once.
  3. Like (2), but issue a warning if any such variables exist instead of silently dropping them.
  4. Like (2), but raise an error instead of a warning. Require the user to explicitly drop them with .drop(). This is probably the safest behavior.

I think (4) would be my preferred option. Some users would undoubtedly find this annoying, but the power-users for whom we are adding this feature would likely appreciate it.

Usage example

```python
import xarray
import dask.array as da

ds = xarray.Dataset({'u': (('x',), da.arange(1000, chunks=100))})

# create the new zarr store, but don't write data
path = 'my-data.zarr'
ds.to_zarr(path, compute=False)

# look at the unwritten data
ds_opened = xarray.open_zarr(path)
print('Data before writing:', ds_opened.u.data[::100].compute())
# Data before writing: [ 1 100 1 100 100 1 1 1 1 1]

# write out each slice (could be in separate processes)
for start in range(0, 1000, 100):
    selection = {'x': slice(start, start + 100)}
    ds.isel(selection).to_zarr(path, region=selection)

print('Data after writing:', ds_opened.u.data[::100].compute())
# Data after writing: [ 0 100 200 300 400 500 600 700 800 900]
```

  • [x] Closes https://github.com/pydata/xarray/issues/3096
  • [x] Integration test
  • [x] Unit tests
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4035/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
718492237 MDExOlB1bGxSZXF1ZXN0NTAwODc5MTY3 4500 Add variable/attribute names to netCDF validation errors shoyer 1217238 closed 0     1 2020-10-10T00:47:18Z 2020-10-10T05:28:08Z 2020-10-10T05:28:08Z MEMBER   0 pydata/xarray/pulls/4500

This should result in a better user experience, e.g., specifically pointing out the attribute with an invalid value.

  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4500/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
702372014 MDExOlB1bGxSZXF1ZXN0NDg3NjYxMzIz 4426 Fix for h5py deepcopy issues shoyer 1217238 closed 0     6 2020-09-16T01:11:00Z 2020-09-18T22:31:13Z 2020-09-18T22:31:09Z MEMBER   0 pydata/xarray/pulls/4426
  • [x] Closes #4425
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4426/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
669307837 MDExOlB1bGxSZXF1ZXN0NDU5Njk1NDA5 4292 Fix indexing with datetime64[ns] with pandas=1.1 shoyer 1217238 closed 0     11 2020-07-31T00:48:50Z 2020-09-16T03:11:48Z 2020-09-16T01:33:30Z MEMBER   0 pydata/xarray/pulls/4292

Fixes #4283

The underlying issue is that calling .item() on a NumPy array with dtype=datetime64[ns] returns an integer, rather than an np.datetime64 scalar. This is somewhat baffling but works this way because .item() returns native Python types, but datetime.datetime doesn't support nanosecond precision.

pandas.Index.get_loc used to support these integers, but now is more strict. Hence we get errors.

We can fix this by using array[()] to convert 0d arrays into NumPy scalars instead of calling array.item().
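
The difference is easy to see with plain NumPy (standalone illustration):

```python
import numpy as np

t = np.array("2020-07-31", dtype="datetime64[ns]")  # 0d datetime64[ns] array

print(type(t.item()))  # <class 'int'> -- nanoseconds since the epoch
print(type(t[()]))     # <class 'numpy.datetime64'> -- a proper NumPy scalar
```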

I've added a crude regression test. There may well be a better way to test this but I haven't figured it out yet.

  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4292/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
638597800 MDExOlB1bGxSZXF1ZXN0NDM0MzMxNzQ3 4154 Update issue templates inspired/based on dask shoyer 1217238 closed 0     1 2020-06-15T07:00:53Z 2020-08-05T13:05:33Z 2020-06-17T16:50:57Z MEMBER   0 pydata/xarray/pulls/4154

See https://github.com/dask/dask/issues/new/choose for an approximate example of what this looks like.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4154/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
646073396 MDExOlB1bGxSZXF1ZXN0NDQwNDMxNjk5 4184 Improve the speed of from_dataframe with a MultiIndex (by 40x!) shoyer 1217238 closed 0     1 2020-06-26T07:39:14Z 2020-07-02T20:39:02Z 2020-07-02T20:39:02Z MEMBER   0 pydata/xarray/pulls/4184

Before:

pandas.MultiIndexSeries.time_to_xarray
======= ========= ==========
--             subset
------- --------------------
dtype     True     False
======= ========= ==========
  int    505±0ms   37.1±0ms
 float   485±0ms   38.3±0ms
======= ========= ==========

After:

pandas.MultiIndexSeries.time_to_xarray
======= ============ ==========
--               subset
------- -----------------------
dtype      True       False
======= ============ ==========
  int    10.7±0.4ms   22.6±1ms
 float   10.0±0.8ms   21.1±1ms
======= ============ ==========

~~There are still some cases where we have to fall back to the existing slow implementation, but hopefully they should now be relatively rare.~~ Edit: now we always use the new implementation
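
For reference, a small example of the operation being benchmarked (sizes are illustrative):

```python
import numpy as np
import pandas as pd
import xarray as xr

# A Series with a two-level MultiIndex, converted to an xarray Dataset;
# this is the code path that is now much faster.
index = pd.MultiIndex.from_product([range(100), range(100)], names=["a", "b"])
series = pd.Series(np.arange(10_000), index=index, name="v")
ds = xr.Dataset.from_dataframe(series.to_frame())
```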

  • [x] Closes #2459, closes #4186
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] Passes isort -rc . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4184/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 1,
    "eyes": 0
}
    xarray 13221727 pull
645961347 MDExOlB1bGxSZXF1ZXN0NDQwMzQ2NTQz 4182 Show data by default in HTML repr for DataArray shoyer 1217238 closed 0     0 2020-06-26T02:25:08Z 2020-06-28T17:03:41Z 2020-06-28T17:03:41Z MEMBER   0 pydata/xarray/pulls/4182
  • [x] Closes #4176
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4182/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
644170008 MDExOlB1bGxSZXF1ZXN0NDM4ODQxMjk2 4171 Remove <pre> from nested HTML repr shoyer 1217238 closed 0     0 2020-06-23T21:51:14Z 2020-06-24T15:45:20Z 2020-06-24T15:45:00Z MEMBER   0 pydata/xarray/pulls/4171

Using <pre> messes up the display of nested HTML reprs, e.g., from dask. Now we only use the <pre> tag when displaying raw text reprs.

(Before/after screenshots from a Jupyter notebook omitted.)

  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4171/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
613546626 MDExOlB1bGxSZXF1ZXN0NDE0MjgwMDEz 4039 Revise pull request template shoyer 1217238 closed 0     5 2020-05-06T19:08:19Z 2020-06-18T05:45:11Z 2020-06-18T05:45:10Z MEMBER   0 pydata/xarray/pulls/4039

See below for the new language, to clarify that documentation is only necessary for "user visible changes."

I added "including notable bug fixes" to indicate that minor bug fixes may not be worth noting (I was thinking of test-suite only fixes in this category) but perhaps that is too confusing.

cc @pydata/xarray for opinions!

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes isort -rc . && black . && mypy . && flake8
  • [ ] Fully documented, including whats-new.rst for user visible changes (including notable bug fixes) and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4039/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
639334065 MDExOlB1bGxSZXF1ZXN0NDM0OTQ0NTc4 4159 Test RTD's new pull request builder shoyer 1217238 closed 0     1 2020-06-16T03:06:32Z 2020-06-17T16:54:02Z 2020-06-17T16:54:02Z MEMBER   1 pydata/xarray/pulls/4159

https://docs.readthedocs.io/en/latest/guides/autobuild-docs-for-pull-requests.html

Don't merge this!

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4159/reactions",
    "total_count": 3,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 3,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
639397110 MDExOlB1bGxSZXF1ZXN0NDM0OTk1NzQz 4160 Fix failing upstream-dev build & remove docs build shoyer 1217238 closed 0     0 2020-06-16T06:08:55Z 2020-06-16T06:35:49Z 2020-06-16T06:35:44Z MEMBER   0 pydata/xarray/pulls/4160

We'll use RTD's new doc builder instead. For an example, click on "docs/readthedocs.org:xray" below or look at GH4159.

  • [x] Closes https://github.com/pydata/xarray/issues/4146
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4160/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
612214951 MDExOlB1bGxSZXF1ZXN0NDEzMjIyOTEx 4028 Remove broken test for Panel with to_pandas() shoyer 1217238 closed 0     5 2020-05-04T22:41:42Z 2020-05-06T01:50:21Z 2020-05-06T01:50:21Z MEMBER   0 pydata/xarray/pulls/4028

We don't support creating a Panel with to_pandas() with any version of pandas at present, so this test was previously broken if pandas < 0.25 was installed.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4028/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
612838635 MDExOlB1bGxSZXF1ZXN0NDEzNzA3Mzgy 4032 Allow warning with cartopy in docs plotting build shoyer 1217238 closed 0     1 2020-05-05T19:25:11Z 2020-05-05T21:49:26Z 2020-05-05T21:49:26Z MEMBER   0 pydata/xarray/pulls/4032

Fixes https://github.com/pydata/xarray/issues/4030

It looks like this is triggered by the new cartopy version now being installed on RTD (version 0.17.0 -> 0.18.0).

Long term we should fix this, but for now it's better just to disable the warning.

Here's the message from RTD:

```
Exception occurred:
  File "/home/docs/checkouts/readthedocs.org/user_builds/xray/conda/latest/lib/python3.8/site-packages/IPython/sphinxext/ipython_directive.py", line 586, in process_input
    raise RuntimeError('Non Expected warning in {} line {}'.format(filename, lineno))
RuntimeError: Non Expected warning in /home/docs/checkouts/readthedocs.org/user_builds/xray/checkouts/latest/doc/plotting.rst line 732
The full traceback has been saved in /tmp/sphinx-err-qav6jjmm.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at https://github.com/sphinx-doc/sphinx/issues. Thanks!

Warning in /home/docs/checkouts/readthedocs.org/user_builds/xray/checkouts/latest/doc/plotting.rst at block ending on line 732
Specify :okwarning: as an option in the ipython:: block to suppress this message

/home/docs/checkouts/readthedocs.org/user_builds/xray/checkouts/latest/xarray/plot/facetgrid.py:373: UserWarning: Tight layout not applied. The left and right margins cannot be made large enough to accommodate all axes decorations.
  self.fig.tight_layout() <<<-------------------------------------------------------------------------
```

https://readthedocs.org/projects/xray/builds/10969146/

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4032/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
612262200 MDExOlB1bGxSZXF1ZXN0NDEzMjYwNTY2 4029 Support overriding existing variables in to_zarr() without appending shoyer 1217238 closed 0     2 2020-05-05T01:06:40Z 2020-05-05T19:28:02Z 2020-05-05T19:28:02Z MEMBER   0 pydata/xarray/pulls/4029

This is nice for consistency with to_netcdf. It should be useful for cases where users want to update values in existing Zarr datasets.

  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4029/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
187625917 MDExOlB1bGxSZXF1ZXN0OTI1MjQzMjg= 1087 WIP: New DataStore / Encoder / Decoder API for review shoyer 1217238 closed 0     8 2016-11-07T05:02:04Z 2020-04-17T18:37:45Z 2020-04-17T18:37:45Z MEMBER   0 pydata/xarray/pulls/1087

The goal here is to make something extensible that we can live with for quite some time, and to clean up the internals of xarray's backend interface.

Most of these are analogues of existing xarray classes with a cleaned up interface. I have not yet worried about backwards compatibility or tests -- I would appreciate feedback on the approach here.

Several parts of the logic exist for the sake of dask. I've included the word "dask" in comments to facilitate inspection by mrocklin.

CC @rabernat, @pwolfram, @jhamman, @mrocklin -- for review

CC @mcgibbon, @JoyMonteiro -- this is relevant to our discussion today about adding support for appending to netCDF files. Don't let this stop you from getting started on that with the existing interface, though.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1087/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
557219435 MDExOlB1bGxSZXF1ZXN0MzY4ODQ5ODk0 3729 Remove garbage text inserted in DASK_LICENSE shoyer 1217238 closed 0     1 2020-01-30T01:46:47Z 2020-01-30T03:32:54Z 2020-01-30T03:32:51Z MEMBER   0 pydata/xarray/pulls/3729

I'm not sure how this got here, but it was probably my fault at one point :)

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes isort -rc . && black . && mypy . && flake8
  • [ ] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3729/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 1,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
511640945 MDExOlB1bGxSZXF1ZXN0MzMxODAyMjE3 3439 Use cftime master for upstream-dev build shoyer 1217238 closed 0     1 2019-10-24T00:40:50Z 2019-10-24T01:28:20Z 2019-10-24T01:28:20Z MEMBER   0 pydata/xarray/pulls/3439

Follow-up on #3436, needed now that https://github.com/Unidata/cftime/pull/127 has been merged.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3439/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
505661688 MDExOlB1bGxSZXF1ZXN0MzI3MDQ3NzQ3 3393 BUG: overrides to a dimension coordinate do not get aligned shoyer 1217238 closed 0     2 2019-10-11T06:22:42Z 2019-10-11T15:48:02Z 2019-10-11T15:47:58Z MEMBER   0 pydata/xarray/pulls/3393

I really should have known better than to remove this check -- there was a whole comment I had written explaining why it was there! I guess this is a lesson in why it's always important to write regression tests.

  • [x] Closes #3377
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3393/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
482663274 MDExOlB1bGxSZXF1ZXN0MzA4ODkwMzc4 3234 Explicitly keep track of indexes with merging shoyer 1217238 closed 0     4 2019-08-20T06:11:55Z 2019-10-04T04:43:12Z 2019-10-04T04:42:50Z MEMBER   0 pydata/xarray/pulls/3234

Part of the explicit indexes refactor (https://github.com/pydata/xarray/issues/1603)

No user facing changes.

  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3234/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
495380049 MDExOlB1bGxSZXF1ZXN0MzE4OTM4NDY5 3319 Fix isel performance regression shoyer 1217238 closed 0     0 2019-09-18T18:15:08Z 2019-09-18T18:33:16Z 2019-09-18T18:33:16Z MEMBER   0 pydata/xarray/pulls/3319

xref #2227

Before: indexing.BooleanIndexing.time_indexing 898±0ms

After: indexing.BooleanIndexing.time_indexing 401±0ms

  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3319/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
494943997 MDExOlB1bGxSZXF1ZXN0MzE4NTk1NDE3 3316 Clarify that "scatter" is a plotting method in what's new. shoyer 1217238 closed 0     3 2019-09-18T02:02:22Z 2019-09-18T03:47:46Z 2019-09-18T03:46:35Z MEMBER   0 pydata/xarray/pulls/3316

When I read this, I thought it was referring to scattering data somehow :).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3316/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
479914290 MDExOlB1bGxSZXF1ZXN0MzA2NzExNDYx 3210 sparse=True option for from_dataframe and from_series shoyer 1217238 closed 0     5 2019-08-13T01:09:19Z 2019-08-27T16:04:13Z 2019-08-27T08:54:26Z MEMBER   0 pydata/xarray/pulls/3210

Fixes https://github.com/pydata/xarray/issues/3206

Example usage:

In [3]: import pandas as pd
   ...: import numpy as np
   ...: import xarray
   ...: df = pd.DataFrame({
   ...:     'w': range(10),
   ...:     'x': list('abcdefghij'),
   ...:     'y': np.arange(0, 100, 10),
   ...:     'z': np.ones(10),
   ...: }).set_index(['w', 'x', 'y'])
   ...:

In [4]: ds = xarray.Dataset.from_dataframe(df, sparse=True)

In [5]: ds.z.data
Out[5]: <COO: shape=(10, 10, 10), dtype=float64, nnz=10, fill_value=nan>
  • [x] Closes #3206, Closes #2139
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3210/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
484636272 MDExOlB1bGxSZXF1ZXN0MzEwNDczMTM5 3254 Fix duck array ops that were calling bottleneck on sparse arrays shoyer 1217238 closed 0     0 2019-08-23T17:31:11Z 2019-08-24T05:30:26Z 2019-08-24T05:08:57Z MEMBER   0 pydata/xarray/pulls/3254

min and max are now working.

notnull was already fixed by one of my earlier PRs.

std/var/median are still broken, but only because sparse hasn't implemented the corresponding NumPy functions yet (nanstd, nanvar and nanmedian).

rank needs a pure NumPy implementation (not via bottleneck) if we want it to work on sparse or dask arrays.

  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3254/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
483017588 MDExOlB1bGxSZXF1ZXN0MzA5MTc3NzM3 3235 Fix xarray's test suite with the dask master shoyer 1217238 closed 0     2 2019-08-20T18:35:03Z 2019-08-20T22:25:23Z 2019-08-20T22:25:07Z MEMBER   0 pydata/xarray/pulls/3235

We shouldn't be checking the details of dask's repr.

  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3235/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
479932490 MDExOlB1bGxSZXF1ZXN0MzA2NzI1MTYx 3211 Array formatting fixes for sparse and NEP-18 arrays. shoyer 1217238 closed 0     4 2019-08-13T02:41:37Z 2019-08-16T19:14:04Z 2019-08-16T19:12:28Z MEMBER   0 pydata/xarray/pulls/3211

I also did a bit of cleanup (e.g., renaming methods) in xarray.core.formatting.

Sparse arrays were previously not shown in the Dataset repr:

<xarray.Dataset>
Dimensions:  (x: 4)
Coordinates:
    y        (x) int64 ...
Dimensions without coordinates: x
Data variables:
    a        (x) float64 ...

Now they are:

<xarray.Dataset>
Dimensions:  (x: 4)
Coordinates:
    y        (x) int64 <COO: shape=(4,), nnz=3, fill_value=0>
Dimensions without coordinates: x
Data variables:
    a        (x) float64 <COO: shape=(4,), nnz=4, fill_value=0.0>
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3211/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
479412396 MDExOlB1bGxSZXF1ZXN0MzA2MzA5ODM5 3204 Remove duck_array_ops.as_like_arrays() shoyer 1217238 closed 0     1 2019-08-11T21:15:25Z 2019-08-12T15:06:47Z 2019-08-12T15:06:08Z MEMBER   0 pydata/xarray/pulls/3204

It has some questionable coercion logic that no longer seems to be necessary.

Not a user facing change.

  • [x] Passes black . && mypy . && flake8
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3204/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
467978990 MDExOlB1bGxSZXF1ZXN0Mjk3NDk5NDcx 3132 Internal clean-up of isnull() to avoid relying on pandas shoyer 1217238 closed 0     2 2019-07-15T07:31:32Z 2019-08-05T03:29:20Z 2019-08-05T03:29:20Z MEMBER   0 pydata/xarray/pulls/3132

This version should be much more compatible out of the box with duck typing.
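
A rough sketch of what a pandas-free, duck-typing-friendly null check can look like (illustrative only, not the actual implementation):

```python
import numpy as np

def isnull(data):
    dtype = data.dtype
    if np.issubdtype(dtype, np.floating) or np.issubdtype(dtype, np.datetime64):
        # NaN and NaT are the only values not equal to themselves, and `!=`
        # works on any array type implementing the NumPy operator protocol.
        return data != data
    # Integer, boolean, and string dtypes cannot hold missing values.
    return np.zeros(data.shape, dtype=bool)
```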

No user facing changes.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3132/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
471177659 MDExOlB1bGxSZXF1ZXN0Mjk5OTQ0MzI3 3157 Temporarily remove pynio from py36 CI build shoyer 1217238 closed 0     0 2019-07-22T16:29:39Z 2019-07-22T16:44:55Z 2019-07-22T16:44:52Z MEMBER   0 pydata/xarray/pulls/3157

This should get things building again.

xref https://github.com/pydata/xarray/issues/3154

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3157/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
469891383 MDExOlB1bGxSZXF1ZXN0Mjk5MDEyOTQz 3143 Remove the matplotlib=3.0 constraint from py36.yml shoyer 1217238 closed 0     1 2019-07-18T17:16:22Z 2019-07-18T17:39:59Z 2019-07-18T17:39:59Z MEMBER   0 pydata/xarray/pulls/3143

The upstream issue that required the constraint seems to have been fixed: https://github.com/barronh/pseudonetcdf/issues/69

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3143/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
467799744 MDExOlB1bGxSZXF1ZXN0Mjk3MzcwNTAw 3122 Tell codecov that Azure is a CI provider shoyer 1217238 closed 0     0 2019-07-14T06:10:33Z 2019-07-14T08:02:52Z 2019-07-14T08:02:48Z MEMBER   0 pydata/xarray/pulls/3122

In theory, this should make codecov wait until reports from Azure are back (and require that all Azure checks pass) before posting its comment.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3122/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
464787713 MDExOlB1bGxSZXF1ZXN0Mjk0OTkzMzMw 3082 Cache root netCDF4.Dataset objects instead of groups shoyer 1217238 closed 0     4 2019-07-05T22:26:19Z 2019-07-10T16:01:45Z 2019-07-10T16:01:38Z MEMBER   0 pydata/xarray/pulls/3082

NetCDF-C and HDF5 are not thread-safe, so it's likely that closing the file object associated with one group could invalidate other open groups from the same file.

Now, we cache a single object corresponding to the root group for each file, and access sub-groups on the fly as needed.
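
A minimal sketch of that access pattern with the netCDF4 library (file and group names are made up):

```python
import netCDF4

# Create a small file with nested groups for the illustration.
with netCDF4.Dataset("example_groups.nc", mode="w") as nc:
    nc.createGroup("model").createGroup("run1")

# Cache a single handle to the root dataset...
root = netCDF4.Dataset("example_groups.nc", mode="r")
# ...and resolve sub-groups from it on the fly, instead of keeping (and
# possibly closing) a separate cached handle per group.
run1 = root.groups["model"].groups["run1"]
root.close()
```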

  • [x] Closes https://github.com/pydata/xarray/issues/2954
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3082/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
464888388 MDExOlB1bGxSZXF1ZXN0Mjk1MDYxNTk0 3087 Suppress warnings and add test coverage shoyer 1217238 closed 0     4 2019-07-06T20:33:24Z 2019-07-10T01:16:53Z 2019-07-10T01:16:49Z MEMBER   0 pydata/xarray/pulls/3087
  • Suppressed various warnings.
  • PseudoNetCDF was not getting tested. Now it is.
  • Added test coverage for print_versions.py
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3087/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
464796480 MDExOlB1bGxSZXF1ZXN0Mjk0OTk5OTM0 3084 One CI build for upstream dev versions shoyer 1217238 closed 0     1 2019-07-05T23:40:00Z 2019-07-06T20:02:41Z 2019-07-06T19:54:47Z MEMBER   0 pydata/xarray/pulls/3084

Using pre-built wheels for NumPy and pandas, so hopefully things will install with reasonable speed.

Adapted from https://github.com/dask/dask/blob/2.0.0/continuous_integration/travis/install.sh

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3084/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 1,
    "eyes": 0
}
    xarray 13221727 pull
464798268 MDExOlB1bGxSZXF1ZXN0Mjk1MDAxMjYz 3085 Fix codecov reports shoyer 1217238 closed 0     2 2019-07-05T23:58:56Z 2019-07-06T05:12:25Z 2019-07-06T05:01:11Z MEMBER   0 pydata/xarray/pulls/3085

The location of the XML output was being changed by pytest-azurepipelines.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3085/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
421879216 MDExOlB1bGxSZXF1ZXN0MjYxODE2ODMw 2816 More explicit index handling in dataset.py shoyer 1217238 closed 0     0 2019-03-17T03:57:19Z 2019-07-05T17:49:59Z 2019-07-05T17:49:59Z MEMBER   0 pydata/xarray/pulls/2816

More progress towards https://github.com/pydata/xarray/issues/1603 without any user facing changes.

The only part that's left is explicit index handling in merge.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2816/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
463037547 MDExOlB1bGxSZXF1ZXN0MjkzNTk2MzM5 3072 Remove Travis-CI in favor of only using Azure Pipelines shoyer 1217238 closed 0     0 2019-07-02T06:06:31Z 2019-07-04T22:10:24Z 2019-07-04T22:10:19Z MEMBER   0 pydata/xarray/pulls/3072

Azure seems to be working pretty well, e.g., we now use it for test coverage with codecov. I don't see any particular reason to keep Travis around at this point.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3072/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
463021297 MDExOlB1bGxSZXF1ZXN0MjkzNTgzNDI1 3070 Fix the ability to run network and flaky tests shoyer 1217238 closed 0     10 2019-07-02T05:02:44Z 2019-07-04T20:51:29Z 2019-07-04T20:51:29Z MEMBER   0 pydata/xarray/pulls/3070

The old setup didn't seem to work on CI, even when we explicitly passed the relevant flags.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3070/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
463363939 MDExOlB1bGxSZXF1ZXN0MjkzODU3NjM1 3075 Add FAQ entry clarifying what parts of xarray are public API shoyer 1217238 closed 0     0 2019-07-02T18:03:20Z 2019-07-04T03:24:41Z 2019-07-04T03:24:37Z MEMBER   0 pydata/xarray/pulls/3075

xref https://github.com/Unidata/MetPy/issues/1077

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3075/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
463010338 MDExOlB1bGxSZXF1ZXN0MjkzNTc0NzA1 3069 Cleanup uses of super() to use Python 3 only syntax shoyer 1217238 closed 0     1 2019-07-02T04:11:21Z 2019-07-02T15:48:16Z 2019-07-02T15:48:16Z MEMBER   0 pydata/xarray/pulls/3069

No user facing changes

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3069/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
463029506 MDExOlB1bGxSZXF1ZXN0MjkzNTg5ODg3 3071 Try installing pytest-azurepipelines shoyer 1217238 closed 0     2 2019-07-02T05:37:45Z 2019-07-02T13:34:13Z 2019-07-02T06:04:14Z MEMBER   0 pydata/xarray/pulls/3071

See https://github.com/tonybaloney/pytest-azurepipelines

No user facing changes

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3071/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462365620 MDExOlB1bGxSZXF1ZXN0MjkzMDc3OTM0 3063 Fix another OS X test that can be flakey on Azure shoyer 1217238 closed 0     0 2019-06-30T04:17:30Z 2019-07-02T03:35:23Z 2019-07-02T03:35:20Z MEMBER   0 pydata/xarray/pulls/3063

See https://github.com/pydata/xarray/pull/3058/checks?check_run_id=158702766 for an example failure

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3063/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462364974 MDExOlB1bGxSZXF1ZXN0MjkzMDc3NDk4 3061 Internal cleanup in xarray.backends.file_manager shoyer 1217238 closed 0     0 2019-06-30T04:04:09Z 2019-07-02T03:34:58Z 2019-07-02T03:34:58Z MEMBER   0 pydata/xarray/pulls/3061

No user facing changes.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3061/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462517912 MDExOlB1bGxSZXF1ZXN0MjkzMTg0Nzkz 3067 Tweak codecov.io configuration shoyer 1217238 closed 0     0 2019-07-01T05:33:22Z 2019-07-01T16:31:38Z 2019-07-01T16:31:33Z MEMBER   0 pydata/xarray/pulls/3067
  • Exclude _version.py from coverage, since it was generated from versioneer.
  • Remove the "reach" graph from comments.
  • Don't give PRs a failing status if they decrease coverage.
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3067/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462365083 MDExOlB1bGxSZXF1ZXN0MjkzMDc3NTc5 3062 Mark xarray as "Production" in setup.py rather than "beta" shoyer 1217238 closed 0     0 2019-06-30T04:06:17Z 2019-07-01T15:40:23Z 2019-07-01T15:40:23Z MEMBER   0 pydata/xarray/pulls/3062

Apparently some users have IT policies prohibiting the use of "beta" software!

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3062/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462239596 MDExOlB1bGxSZXF1ZXN0MjkyOTk0OTMz 3058 Ensure xarray imports (and docs build) even without pandas.Panel shoyer 1217238 closed 0     2 2019-06-28T23:34:07Z 2019-06-30T18:01:13Z 2019-06-30T18:01:07Z MEMBER   0 pydata/xarray/pulls/3058

It's being removed in the next pandas 0.25 release (see https://github.com/pandas-dev/pandas/pull/27101).

  • [x] Closes #3057
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3058/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462369398 MDExOlB1bGxSZXF1ZXN0MjkzMDgwNDA1 3064 Upload coverage from Azure to codecov.io shoyer 1217238 closed 0     1 2019-06-30T05:30:34Z 2019-06-30T06:55:33Z 2019-06-30T06:55:33Z MEMBER   0 pydata/xarray/pulls/3064
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3064/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462329726 MDExOlB1bGxSZXF1ZXN0MjkzMDU2MTA4 3059 Fix test suite use of str(exception) shoyer 1217238 closed 0     1 2019-06-29T18:45:58Z 2019-06-30T05:53:12Z 2019-06-29T19:23:09Z MEMBER   0 pydata/xarray/pulls/3059

This fixes test failures on master, as noted in #2706

  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3059/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462353985 MDExOlB1bGxSZXF1ZXN0MjkzMDcwNTcx 3060 DOC: reorganize whats-new for 0.12.2 shoyer 1217238 closed 0     0 2019-06-30T00:32:15Z 2019-06-30T03:31:40Z 2019-06-30T03:31:35Z MEMBER   0 pydata/xarray/pulls/3060

xref https://github.com/pydata/xarray/issues/2977

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3060/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
459602791 MDExOlB1bGxSZXF1ZXN0MjkwOTA3MzI5 3040 Fix rolling window operations with dask when bottleneck is installed shoyer 1217238 closed 0     0 2019-06-23T18:11:16Z 2019-06-28T16:49:09Z 2019-06-28T16:49:04Z MEMBER   0 pydata/xarray/pulls/3040

Previously, these operations could silently return incorrect results (dask 2.0), or use unbounded amounts of memory (older versions of dask).

This requires a fairly large refactoring, because deciding when to use bottleneck now needs to be done at runtime rather than at import time. These methods are now constructed as methods rather than being injected afterwards into the class, which should also be a much more standard and understandable design.
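
A rough sketch of what "deciding at runtime" means here (hypothetical helper, not xarray's actual internals):

```python
import numpy as np

def can_use_bottleneck(array) -> bool:
    # Called per operation rather than at import time: only plain in-memory
    # NumPy arrays go through bottleneck; dask arrays never do.
    if type(array).__module__.startswith("dask."):
        return False
    try:
        import bottleneck  # noqa: F401
    except ImportError:
        return False
    return isinstance(array, np.ndarray)
```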

  • [x] Closes #2940
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3040/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
462101540 MDExOlB1bGxSZXF1ZXN0MjkyODg0MjUz 3055 Better add-conda-to-path template for azure pipelines shoyer 1217238 closed 0     1 2019-06-28T15:55:32Z 2019-06-28T16:07:06Z 2019-06-28T16:07:06Z MEMBER   0 pydata/xarray/pulls/3055

Template time expansion should reduce the noise in the CI output.

Follow on to https://github.com/pydata/xarray/pull/3039

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3055/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
459569339 MDExOlB1bGxSZXF1ZXN0MjkwODg0OTY5 3039 Set up CI with Azure Pipelines (and remove Appveyor) shoyer 1217238 closed 0     7 2019-06-23T12:16:56Z 2019-06-28T14:44:53Z 2019-06-27T20:44:12Z MEMBER   0 pydata/xarray/pulls/3039

xref https://github.com/astropy/astropy/pull/8445

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3039/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
461771791 MDExOlB1bGxSZXF1ZXN0MjkyNjIxNzYw 3052 Replace Appveyor with Azure Pipelines in README and contributor guide shoyer 1217238 closed 0     1 2019-06-27T22:12:11Z 2019-06-28T00:37:08Z 2019-06-28T00:37:05Z MEMBER   0 pydata/xarray/pulls/3052
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3052/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
458719321 MDExOlB1bGxSZXF1ZXN0MjkwMjYyNjUw 3036 Raise an error when doing rolling window operations with dask shoyer 1217238 closed 0     1 2019-06-20T15:16:41Z 2019-06-23T18:13:33Z 2019-06-23T18:12:16Z MEMBER   0 pydata/xarray/pulls/3036

xref #2940, #2942

  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3036/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
456963929 MDExOlB1bGxSZXF1ZXN0Mjg4ODcwMDQ0 3027 Ensure explicitly indexed arrays are preserved shoyer 1217238 closed 0     3 2019-06-17T14:21:18Z 2019-06-23T16:53:11Z 2019-06-23T16:49:23Z MEMBER   0 pydata/xarray/pulls/3027

Fixes https://github.com/pydata/xarray/issues/3009

Previously, indexing an ImplicitToExplicitIndexingAdapter object could directly return an ExplicitlyIndexed object, which could not be indexed normally, e.g., x[index] could result in an object that could not be indexed properly. This resulted in broken behavior with dask's new _meta attribute.

I'm pretty sure this fix is appropriate, but it does introduce two failing tests with xarray on dask master. In particular, there are now errors raised inside two tests from dask's blockwise_meta helper function:

```
    return meta.astype(dtype)
E   AttributeError: 'ImplicitToExplicitIndexingAdapter' object has no attribute 'astype'
```

cc @mrocklin @pentschev

  • [x] Tests added
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3027/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
427452658 MDExOlB1bGxSZXF1ZXN0MjY2MDQ5Mzcz 2859 More consistency checks shoyer 1217238 closed 0     4 2019-03-31T22:04:12Z 2019-06-18T14:05:09Z 2019-06-18T14:05:09Z MEMBER   0 pydata/xarray/pulls/2859

The first commit here is just https://github.com/pydata/xarray/pull/2858/files

It appears that we have our work cut out for us -- about 180 failing tests if we turn on these more rigorous checks by default:

```
collected 8308 items
[... per-file pytest progress output omitted; the failures are concentrated in
test_backends, test_combine, test_conventions, test_dataarray, test_dataset,
test_distributed, test_interp and test_ufuncs ...]
============= 180 failed, 7733 passed, 356 skipped, 38 xfailed, 1 xpassed, 10 warnings in 465.51 seconds =============
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2859/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
442316508 MDExOlB1bGxSZXF1ZXN0Mjc3NDU1ODYz 2952 Less verbose tests on Appveyor shoyer 1217238 closed 0     0 2019-05-09T16:09:32Z 2019-05-13T17:08:10Z 2019-05-13T17:08:10Z MEMBER   0 pydata/xarray/pulls/2952

We don't want 10,000 lines of output printing every single test :)

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2952/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
437996498 MDExOlB1bGxSZXF1ZXN0Mjc0MTQxMDY5 2925 Attempt to fix py35 build on Travis shoyer 1217238 closed 0     1 2019-04-28T00:15:14Z 2019-05-04T06:15:55Z 2019-05-04T06:15:54Z MEMBER   0 pydata/xarray/pulls/2925

This build currently installs NumPy 1.11, which isn't supported by xarray. Maybe adding minimum numpy and pandas versions will help.
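For illustration only, a minimal sketch of declaring minimum numpy/pandas versions via setuptools; the version floors and the exact mechanism (the Travis config may pin versions in a requirements file instead) are assumptions, not necessarily what this PR does:

```
# Hypothetical setup.py excerpt; the version floors shown are illustrative.
from setuptools import setup

setup(
    name="example-package",
    version="0.0.1",
    install_requires=[
        "numpy >= 1.12",   # assumed minimum; an old 1.11 install would then be rejected
        "pandas >= 0.19",  # assumed minimum
    ],
)
```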

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2925/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
430272298 MDExOlB1bGxSZXF1ZXN0MjY4MTk1NzU3 2878 Fix mypy typing error in cftime_offsets.py shoyer 1217238 closed 0     0 2019-04-08T06:14:47Z 2019-04-08T06:42:47Z 2019-04-08T06:42:31Z MEMBER   0 pydata/xarray/pulls/2878
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2878/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
427451138 MDExOlB1bGxSZXF1ZXN0MjY2MDQ4MzEw 2858 Various fixes for explicit Dataset.indexes shoyer 1217238 closed 0     5 2019-03-31T21:48:47Z 2019-04-04T22:59:48Z 2019-04-04T21:58:24Z MEMBER   0 pydata/xarray/pulls/2858

I've added internal consistency checks to the uses of assert_equal in our test suite, so this shouldn't happen again.
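As a rough sketch of what such an internal invariant check could look like (a hypothetical helper, not the actual code added in this PR): every dimension coordinate should be backed by a matching entry in `.indexes`.

```
import pandas as pd

# Hypothetical invariant check of the kind run alongside assert_equal.
def check_default_indexes(dataset):
    """Every dimension coordinate should be backed by a matching pandas index."""
    for name in dataset.dims:
        if name in dataset.coords:
            assert name in dataset.indexes, f"missing index for {name!r}"
            expected = pd.Index(dataset.coords[name].values)
            assert dataset.indexes[name].equals(expected), f"stale index for {name!r}"
```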

  • [x] Closes #2856, closes #2854
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2858/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
424274061 MDExOlB1bGxSZXF1ZXN0MjYzNjYzOTA5 2845 Fix indexes created by Dataset.swap_dims shoyer 1217238 closed 0     0 2019-03-22T15:43:28Z 2019-03-25T02:30:29Z 2019-03-25T02:30:29Z MEMBER   0 pydata/xarray/pulls/2845
  • [x] Closes #2842
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2845/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
422023339 MDExOlB1bGxSZXF1ZXN0MjYxOTA2MjI4 2818 DOC: remove outdated warning shoyer 1217238 closed 0     0 2019-03-18T03:35:26Z 2019-03-20T19:07:59Z 2019-03-20T19:07:59Z MEMBER   0 pydata/xarray/pulls/2818

This deprecation was finished in v0.11.0.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2818/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
420840205 MDExOlB1bGxSZXF1ZXN0MjYxMDMxMjA0 2809 Push back finalizing deprecations for 0.12 shoyer 1217238 closed 0     2 2019-03-14T05:39:46Z 2019-03-15T04:22:11Z 2019-03-15T04:22:10Z MEMBER   0 pydata/xarray/pulls/2809

0.12 will already have a big change in dropping Python 2.7 support. I'd rather wait a bit longer to finalize these deprecations to minimize the impact on users.

xref https://github.com/pydata/xarray/issues/2776

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2809/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
420840237 MDExOlB1bGxSZXF1ZXN0MjYxMDMxMjI5 2810 Drop failing tests writing multi-dimensional arrays as attributes shoyer 1217238 closed 0     0 2019-03-14T05:39:59Z 2019-03-14T15:59:18Z 2019-03-14T15:59:13Z MEMBER   0 pydata/xarray/pulls/2810

These aren't valid for netCDF files.

Fixes #2803

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2810/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
395332265 MDExOlB1bGxSZXF1ZXN0MjQxODExMjc4 2642 Use pycodestyle for lint checks. shoyer 1217238 closed 0     6 2019-01-02T18:11:38Z 2019-03-14T06:27:20Z 2019-01-03T18:10:13Z MEMBER   0 pydata/xarray/pulls/2642

flake8 includes a few more useful checks, but it's annoying to only see its output in Travis-CI results.

This keeps Travis-CI and pep8speaks in sync.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2642/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
408019919 MDExOlB1bGxSZXF1ZXN0MjUxMzYyMjA3 2756 Update computation.py to use Python 3 function signatures shoyer 1217238 closed 0     1 2019-02-08T06:11:13Z 2019-02-12T05:39:37Z 2019-02-12T05:39:37Z MEMBER   0 pydata/xarray/pulls/2756

This lets us remove lots of ugly explicit calls to kwargs.pop().
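A before/after sketch (hypothetical function names, not the actual computation.py code) of the kind of cleanup this enables -- keyword-only arguments replace manual kwargs.pop() bookkeeping:

```
# Python 2-compatible style: keyword-only behaviour emulated by hand.
def apply_op_old(*args, **kwargs):
    keep_attrs = kwargs.pop("keep_attrs", False)
    dataset_join = kwargs.pop("dataset_join", "exact")
    if kwargs:
        raise TypeError("unexpected keyword arguments: %r" % list(kwargs))
    return args, keep_attrs, dataset_join

# Python 3 style: the signature documents and enforces the same thing.
def apply_op_new(*args, keep_attrs=False, dataset_join="exact"):
    return args, keep_attrs, dataset_join
```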

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2756/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
407830312 MDExOlB1bGxSZXF1ZXN0MjUxMjE1MjU3 2753 Fix mypy errors shoyer 1217238 closed 0     0 2019-02-07T18:12:53Z 2019-02-08T04:45:33Z 2019-02-08T04:45:32Z MEMBER   0 pydata/xarray/pulls/2753

Apparently I wasn't paying attention in my last PR :)

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2753/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
401467505 MDExOlB1bGxSZXF1ZXN0MjQ2MzgwNjk0 2696 Refactor (part of) dataset.py to use explicit indexes shoyer 1217238 closed 0     1 2019-01-21T18:28:27Z 2019-02-06T16:07:42Z 2019-02-06T16:07:39Z MEMBER   0 pydata/xarray/pulls/2696

This is part of the larger project in https://github.com/pydata/xarray/issues/1603

None of this should change public APIs: we're simply making index updates explicit (via the indexes dict) rather than implicit (through variables).

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2696/reactions",
    "total_count": 2,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 1,
    "eyes": 0
}
    xarray 13221727 pull
406049184 MDExOlB1bGxSZXF1ZXN0MjQ5ODUzNTIz 2739 Reenable cross engine read write netCDF test shoyer 1217238 closed 0     0 2019-02-03T03:44:07Z 2019-02-04T04:42:17Z 2019-02-04T04:42:17Z MEMBER   0 pydata/xarray/pulls/2739

Fixes https://github.com/pydata/xarray/issues/2050

I'm not quite sure what was going on, but it passes now.

  • [x] Closes #2050
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2739/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
365961291 MDExOlB1bGxSZXF1ZXN0MjE5NzUyOTE3 2458 WIP: sketch of resample support for CFTimeIndex shoyer 1217238 closed 0     5 2018-10-02T15:44:36Z 2019-02-03T03:21:52Z 2019-02-03T03:21:52Z MEMBER   0 pydata/xarray/pulls/2458

Example usage:

```
import xarray
times = xarray.cftime_range('2000', periods=30, freq='MS')
da = xarray.DataArray(range(30), [('time', times)])
da.resample(time='1AS').mean()
<xarray.DataArray (time: 3)>
array([ 5.5, 17.5, 26.5])
Coordinates:
  * time     (time) object 2001-01-01 00:00:00 ... 2003-01-01 00:00:00
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2458/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 1,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
403521700 MDExOlB1bGxSZXF1ZXN0MjQ3OTM2NjM2 2720 Fix test failures / warnings for pandas 0.24 shoyer 1217238 closed 0     0 2019-01-27T07:08:59Z 2019-01-27T21:02:04Z 2019-01-27T21:02:03Z MEMBER   0 pydata/xarray/pulls/2720
  • [x] Closes #2717
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2720/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
403489769 MDExOlB1bGxSZXF1ZXN0MjQ3OTE3NTk5 2718 DOC: refresh whats-new for 0.11.3 / 0.12.0 shoyer 1217238 closed 0     0 2019-01-26T22:15:54Z 2019-01-27T17:11:48Z 2019-01-27T17:11:48Z MEMBER   0 pydata/xarray/pulls/2718

I just pushed the 0.11.3 bug-fix release to pypi.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2718/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
403299287 MDExOlB1bGxSZXF1ZXN0MjQ3Nzg3NTE5 2708 Update environment for doc build shoyer 1217238 closed 0     0 2019-01-25T19:22:21Z 2019-01-26T18:16:56Z 2019-01-26T18:14:50Z MEMBER   0 pydata/xarray/pulls/2708

We were pinning very old versions for most of these packages. This should fix the failures on ReadTheDocs after https://github.com/pydata/xarray/pull/2707 goes in.

  • [x] Closes #2705
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2708/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
403299817 MDExOlB1bGxSZXF1ZXN0MjQ3Nzg3OTEy 2709 Print full environment from conf.py shoyer 1217238 closed 0     1 2019-01-25T19:23:57Z 2019-01-25T21:52:18Z 2019-01-25T21:52:18Z MEMBER   0 pydata/xarray/pulls/2709

This should make it easier to debug the doc build environment.
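A rough sketch of the idea (an assumed approach; the PR may print the environment differently), added near the top of doc/conf.py so the installed package versions show up in the ReadTheDocs build log:

```
# Hypothetical conf.py snippet: dump the Python executable and installed
# package versions into the documentation build log.
import subprocess
import sys

print("python executable:", sys.executable)
print(subprocess.run([sys.executable, "-m", "pip", "list"],
                     capture_output=True, text=True).stdout)
```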

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2709/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
403288066 MDExOlB1bGxSZXF1ZXN0MjQ3Nzc4MzQ2 2707 BUG: ensure indexes are reset when coords are modified shoyer 1217238 closed 0     0 2019-01-25T18:57:31Z 2019-01-25T19:55:41Z 2019-01-25T19:55:07Z MEMBER   0 pydata/xarray/pulls/2707

This was introduced by the recent indexes refactor, but never made it into a release.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2707/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
402802055 MDExOlB1bGxSZXF1ZXN0MjQ3NDAzNTI2 2704 Write docs for 0.11.3 release shoyer 1217238 closed 0     0 2019-01-24T16:54:21Z 2019-01-24T19:06:08Z 2019-01-24T19:04:12Z MEMBER   0 pydata/xarray/pulls/2704

I'll port these to master separately, given that this commit probably won't apply cleanly.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2704/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
399257829 MDExOlB1bGxSZXF1ZXN0MjQ0NzI2NTE5 2675 Fix test failures with numpy=1.16 shoyer 1217238 closed 0     0 2019-01-15T09:33:00Z 2019-01-15T11:19:59Z 2019-01-15T11:19:59Z MEMBER   0 pydata/xarray/pulls/2675

Fixes https://github.com/pydata/xarray/issues/2673

Note that these were only test failures, not a real bug.

  • [x] Closes #2673
  • [x] Tests added
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2675/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
398674031 MDExOlB1bGxSZXF1ZXN0MjQ0Mjg5MzEy 2669 xfail cftimeindex multiindex test shoyer 1217238 closed 0     1 2019-01-13T16:28:40Z 2019-01-13T17:11:40Z 2019-01-13T17:07:19Z MEMBER   0 pydata/xarray/pulls/2669

It was a nice idea to support CFTimeIndex in a pandas.MultiIndex, but pandas seems to have inadvertently broken this, see https://github.com/pandas-dev/pandas/issues/24263

(This should fix our failing CI tests with pandas 0.24 rc)

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2669/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
396314764 MDExOlB1bGxSZXF1ZXN0MjQyNTMwNTgx 2657 DOC: refresh "Why xarray" and shorten top-level description shoyer 1217238 closed 0     1 2019-01-07T01:02:59Z 2019-01-11T01:06:11Z 2019-01-11T01:06:10Z MEMBER   0 pydata/xarray/pulls/2657

This documentation revamp builds upon @rabernat's rewrite in #2430.

The main change is that the three paragraph description felt too long to me, so I moved the background paragraph on multi-dimensional arrays into the next section, on "Why xarray". I also ended up rewriting most of that page, and made a few adjustments to the FAQ and related projects pages.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2657/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
396989505 MDExOlB1bGxSZXF1ZXN0MjQzMDM5NzQ4 2661 Remove broken Travis-CI builds shoyer 1217238 closed 0     3 2019-01-08T16:40:24Z 2019-01-08T18:34:04Z 2019-01-08T18:34:00Z MEMBER   0 pydata/xarray/pulls/2661

Remove the optional condaforge-rc, netcdf4-dev and pynio-dev builds. These have been continuously failing (due to broken installs), so we shouldn't waste time/energy running them.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2661/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
396241341 MDExOlB1bGxSZXF1ZXN0MjQyNDgzOTQx 2655 Type checking with mypy shoyer 1217238 closed 0     4 2019-01-06T09:08:11Z 2019-01-08T07:21:41Z 2019-01-08T07:21:41Z MEMBER   0 pydata/xarray/pulls/2655

The rest of the scientific Python stack doesn't seem to support type annotations yet, but that's OK -- we can use this incrementally in xarray where it seems appropriate, and it may catch a few bugs. I'm especially excited to use this for internal functions, where we don't always bother with full docstrings (e.g., what is the type of the variables argument?).
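For instance, a short sketch (a hypothetical helper, not actual xarray code) of the kind of internal function where annotations answer the "what type is this argument?" question at a glance:

```
from typing import Dict, Hashable, Tuple

import numpy as np

# Hypothetical internal helper; the annotations make the expected container
# types explicit even without a full docstring.
def sizes_from_variables(
    variables: Dict[Hashable, Tuple[Tuple[str, ...], np.ndarray]]
) -> Dict[str, int]:
    """Map each dimension name to its length, given (dims, data) pairs."""
    sizes: Dict[str, int] = {}
    for dims, data in variables.values():
        for dim, size in zip(dims, data.shape):
            sizes[dim] = size
    return sizes
```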

This includes:

1. various minor fixes to ensure that "mypy xarray" passes.
2. ~~adding "mypy xarray" to our lint check on Travis-CI.~~

For reference, see "Using mypy with an existing codebase": https://mypy.readthedocs.io/en/stable/existing_code.html

Question: are we OK with (2)? This means Travis-CI will fail if your code causes mypy to error.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2655/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
312077122 MDExOlB1bGxSZXF1ZXN0MTgwMDI0Mjky 2041 Add reference on MCVE to GitHub issue template shoyer 1217238 closed 0     0 2018-04-06T18:41:34Z 2019-01-06T06:17:43Z 2018-04-11T21:38:50Z MEMBER   0 pydata/xarray/pulls/2041
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2041/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
395045813 MDExOlB1bGxSZXF1ZXN0MjQxNjExMTU4 2639 ENH: switch Dataset and DataArray to use explicit indexes shoyer 1217238 closed 0     2 2019-01-01T00:48:42Z 2019-01-04T21:07:45Z 2019-01-04T17:15:34Z MEMBER   0 pydata/xarray/pulls/2639

xref https://github.com/pydata/xarray/issues/1603

This change switches Dataset.indexes and DataArray.indexes to be backed by explicit dictionaries of indexes, instead of being implicitly defined by the set of coordinates with names matching dimensions.

There are no changes to the public interface yet: these will come later.

My current plan:

1. (This PR) Indexes are recreated from coordinates every time a new DataArray or Dataset is created.
2. (Follow-up PRs) Refactor indexes to be propagated explicitly in xarray operations. This will facilitate future API changes, when indexes will no longer only be associated with dimensions. I will probably add some testing decorator that can be used to mark part of a test as including no creation of default indexes.
3. Add explicit entries into indexes for MultiIndex levels that are checked instead of MultiIndex variables. Still no public API changes (aside from adding more entries to .indexes).
4. Support arbitrary coordinates in indexes.
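For context, a small example of the user-facing `.indexes` mapping that this refactor backs with an explicit dictionary; the behaviour shown here is the existing public API, only the internal bookkeeping changes:

```
import numpy as np
import pandas as pd
import xarray as xr

ds = xr.Dataset(
    {"t": (("time", "x"), np.zeros((3, 2)))},
    coords={"time": pd.date_range("2000-01-01", periods=3), "x": [10, 20]},
)
# .indexes maps each dimension coordinate name to a pandas.Index; after this
# change the mapping is stored explicitly instead of being rebuilt from coords.
print(dict(ds.indexes))
```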

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2639/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
395336962 MDExOlB1bGxSZXF1ZXN0MjQxODE0ODY3 2643 BUG: pytest-runner not required for setup.py shoyer 1217238 closed 0     2 2019-01-02T18:29:36Z 2019-01-03T01:14:38Z 2019-01-03T01:14:38Z MEMBER   0 pydata/xarray/pulls/2643
  • [x] Closes #2641
  • [ ] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2643/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
395041455 MDExOlB1bGxSZXF1ZXN0MjQxNjA4MTIx 2638 TST: silence warnings from bottleneck shoyer 1217238 closed 0     0 2018-12-31T23:27:18Z 2018-12-31T23:48:41Z 2018-12-31T23:48:35Z MEMBER   0 pydata/xarray/pulls/2638

These were adding a lot of noise to the output of our test suite -- something like 500 lines of useless warnings!
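A minimal sketch of the general approach (assumed for illustration; the actual filter pattern and warning message in the PR may differ) for suppressing a specific noisy warning category inside the test suite:

```
import warnings

import numpy as np

# Hypothetical example: ignore a known-noisy RuntimeWarning (e.g. "Mean of
# empty slice") around the call that triggers it, instead of letting it spam
# the pytest output.
def quiet_nanmean(values):
    with warnings.catch_warnings():
        warnings.filterwarnings(
            "ignore", message="Mean of empty slice", category=RuntimeWarning
        )
        return np.nanmean(values)

print(quiet_nanmean([1.0, 2.0, np.nan]))
```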

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2638/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
388977754 MDExOlB1bGxSZXF1ZXN0MjM3MTAyNjYz 2595 Close files when CachingFileManager is garbage collected shoyer 1217238 closed 0     5 2018-12-09T01:53:50Z 2018-12-23T20:11:35Z 2018-12-23T20:11:32Z MEMBER   0 pydata/xarray/pulls/2595

This frees users from needing to worry about explicitly closing files.

Using __del__ turned out to be easier than using weak references.
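A minimal, self-contained sketch of the pattern (a hypothetical class, not xarray's CachingFileManager): the underlying handle is closed when the manager object is garbage collected.

```
# Hypothetical file manager: __del__ guarantees the handle is released even if
# the user never calls close() explicitly.
class LazyFileManager:
    def __init__(self, opener, *args, **kwargs):
        self._opener = opener
        self._args = args
        self._kwargs = kwargs
        self._file = None

    def acquire(self):
        if self._file is None:
            self._file = self._opener(*self._args, **self._kwargs)
        return self._file

    def close(self):
        if self._file is not None:
            self._file.close()
            self._file = None

    def __del__(self):
        self.close()

manager = LazyFileManager(open, "example.txt", mode="w")
manager.acquire().write("hello")
del manager  # the file is closed when the manager is garbage collected
```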

  • [x] Closes #2560
  • [x] Closes #2614
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2595/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
392259525 MDExOlB1bGxSZXF1ZXN0MjM5NTcxMjQ5 2617 Remove failing Appveyor Python 2.7 32-bit build shoyer 1217238 closed 0     0 2018-12-18T17:27:38Z 2018-12-19T03:57:03Z 2018-12-19T03:56:59Z MEMBER   0 pydata/xarray/pulls/2617

There seems to be some sort of dependency issue on Appveyor, but it's not worth tracking down given that we'll be dropping Python 2.7 in the new year anyway.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2617/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);