
issues


43 rows where comments = 3, repo = 13221727 and user = 2448579 sorted by updated_at descending




type 2

  • pull 31
  • issue 12

state 2

  • closed 41
  • open 2

repo 1

  • xarray · 43
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1989227042 PR_kwDOAMm_X85fObtL 8445 Pin pint to >=0.22 dcherian 2448579 closed 0     3 2023-11-12T03:58:40Z 2023-11-13T19:39:54Z 2023-11-13T19:39:53Z MEMBER   0 pydata/xarray/pulls/8445
  • [x] Closes https://github.com/pydata/xarray/issues/7971
  • [x] Closes https://github.com/pydata/xarray/issues/8437.

We were previously pinned to <0.21. Removing the pin didn't change the env, but with >=0.21 we get 0.22, which works.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8445/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1952621896 I_kwDOAMm_X850YqVI 8337 Support rolling with numbagg dcherian 2448579 open 0     3 2023-10-19T16:11:40Z 2023-10-23T15:46:36Z   MEMBER      

Is your feature request related to a problem?

We can do plain reductions, and groupby reductions with numbagg. Rolling is the last one left!

I don't think coarsen will benefit since it's basically a reshape and reduce on that view, so it should already be accelerated. There may be small gains in handling the boundary conditions but that's probably it.
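For reference, the trailing-window semantics that an accelerated rolling reduction computes can be sketched in pure Python (the trailing window and the None padding here are illustrative assumptions, not xarray's API):

```python
def rolling_mean(values, window):
    """Trailing-window rolling mean; positions with an incomplete
    window yield None (analogous to NaN in a rolling reduction)."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)  # not enough points yet
        else:
            out.append(sum(values[i + 1 - window : i + 1]) / window)
    return out
```

A numbagg-backed implementation would compute the same values in one compiled pass instead of re-summing each window.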

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8337/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
1944347086 PR_kwDOAMm_X85c2nyz 8316 Enable numbagg for reductions dcherian 2448579 closed 0     3 2023-10-16T04:46:10Z 2023-10-18T14:54:48Z 2023-10-18T10:39:30Z MEMBER   0 pydata/xarray/pulls/8316
  • [ ] Tests added - check bottleneck tests
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8316/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1736542260 PR_kwDOAMm_X85R6fac 7888 Add cfgrib,ipywidgets to doc env dcherian 2448579 closed 0     3 2023-06-01T15:11:10Z 2023-06-16T14:14:01Z 2023-06-16T14:13:59Z MEMBER   0 pydata/xarray/pulls/7888
  • [x] Closes #7841
  • [x] Closes #7892
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7888/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1689773381 PR_kwDOAMm_X85PctlP 7798 Fix groupby binary ops when grouped array is subset relative to other dcherian 2448579 closed 0     3 2023-04-30T04:14:14Z 2023-05-03T12:58:35Z 2023-05-02T14:48:42Z MEMBER   0 pydata/xarray/pulls/7798
  • [x] Closes #7797
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

cc @slevang

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7798/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1632422255 PR_kwDOAMm_X85Md6iW 7650 Pin pandas < 2 dcherian 2448579 closed 0     3 2023-03-20T16:03:58Z 2023-04-25T13:42:48Z 2023-03-22T14:53:53Z MEMBER   0 pydata/xarray/pulls/7650

Pandas is expecting to release v2 in two weeks (pandas-dev/pandas#46776 (comment)). But we are still incompatible with their main branch:

  • #7441
  • #7420

This PR pins pandas to <2

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7650/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1610063645 PR_kwDOAMm_X85LTHGz 7586 Fix lazy negative slice rewriting. dcherian 2448579 closed 0     3 2023-03-05T05:31:17Z 2023-03-27T21:05:54Z 2023-03-27T21:05:51Z MEMBER   0 pydata/xarray/pulls/7586

There was a bug in estimating the last index of the slice. Index a range object instead.

  • [x] Closes #6560
  • [x] Tests added
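The idea behind the fix — index a `range` object rather than estimate the slice's last index by hand — can be sketched with a hypothetical helper (not the actual xarray code):

```python
def as_positive_slice(slc, size):
    """Rewrite a negative-step slice as an equivalent positive-step
    slice (whose selection must then be reversed). Indexing a range
    object lets Python do the start/stop/step bookkeeping, avoiding
    off-by-one errors in estimating the last index by hand.
    Assumes the slice selects at least one element."""
    idx = range(size)[slc]
    if idx.step < 0:
        return slice(idx[-1], idx[0] + 1, -idx.step)
    return slice(idx.start, idx.stop, idx.step)
```

For example, `slice(8, 2, -2)` on a length-10 axis becomes `slice(4, 9, 2)`; reversing that positive-step selection reproduces the original.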
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7586/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1573030587 PR_kwDOAMm_X85JXRu7 7506 Fix whats-new for 2023.02.0 dcherian 2448579 closed 0     3 2023-02-06T18:01:17Z 2023-02-07T16:14:55Z 2023-02-07T16:14:51Z MEMBER   0 pydata/xarray/pulls/7506

Oops. I used "GitHub Codespaces" to edit whats-new, but it turns out that if you commit in there, it just commits to main!

This fixes the pre-commit error.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7506/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1433815234 PR_kwDOAMm_X85CF7j3 7249 whats-new for 2022.11.0 dcherian 2448579 closed 0     3 2022-11-02T21:35:13Z 2022-11-04T20:43:02Z 2022-11-04T20:43:00Z MEMBER   0 pydata/xarray/pulls/7249  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7249/reactions",
    "total_count": 3,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 3,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1333514579 I_kwDOAMm_X85Pe9FT 6902 Flox based groupby operations don't support `dtype` in mean method dcherian 2448579 closed 0     3 2022-08-09T16:38:25Z 2022-10-11T17:45:27Z 2022-10-11T17:45:27Z MEMBER      

Discussed in https://github.com/pydata/xarray/discussions/6901

<sup>Originally posted by **tasansal** August 9, 2022</sup>

We have been using the new groupby logic with Flox and numpy_groupies; however, when we run the following, the `dtype` is not recognized as a valid argument. This breaks API compatibility for cases where you may not have the acceleration libraries installed. Not sure if this has to be upstream in

In addition to base Xarray we have the following extras installed: Flox, numpy_groupies, Bottleneck. We do this because our data is `float32` but we want the accumulator in mean to be `float64` for accuracy. One solution is to cast the variable to float64 before mean, which may cause a copy and spike in memory usage. When Flox and numpy_groupies are not installed, it works as expected. We are working with multi-dimensional time-series of weather forecast models.

```python
da = xr.load_mfdataset(...)
da.groupby("time.month").mean(dtype='float64').compute()
```

Here is the end of the traceback and it appears it is on Flox.

```shell
File "/home/altay_sansal_tgs_com/miniconda3/envs/wind-data-mos/lib/python3.10/site-packages/flox/core.py", line 786, in _aggregate
    return _finalize_results(results, agg, axis, expected_groups, fill_value, reindex)
File "/home/altay_sansal_tgs_com/miniconda3/envs/wind-data-mos/lib/python3.10/site-packages/flox/core.py", line 747, in _finalize_results
    finalized[agg.name] = agg.finalize(*squeezed["intermediates"], **agg.finalize_kwargs)
TypeError: <lambda>() got an unexpected keyword argument 'dtype'
```

What is the best way to handle this, maybe fix it in Flox?
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6902/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1217509109 PR_kwDOAMm_X8424hSf 6525 Add cumsum to DatasetGroupBy dcherian 2448579 closed 0     3 2022-04-27T15:19:20Z 2022-07-20T01:31:41Z 2022-07-20T01:31:37Z MEMBER   0 pydata/xarray/pulls/6525
  • [x] Closes #3141, Closes #3417
  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6525/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1298145215 I_kwDOAMm_X85NYB-_ 6763 Map_blocks should raise nice error if provided template has no dask arrays dcherian 2448579 closed 0     3 2022-07-07T21:58:06Z 2022-07-14T17:42:26Z 2022-07-14T17:42:26Z MEMBER      

Discussed in https://github.com/pydata/xarray/discussions/6762

<sup>Originally posted by **tlsw231** July 7, 2022</sup>

I am trying to use `map_blocks` to: ingest a multi-dimensional array as input, reduce along one dimension and add extra dimensions to the output. Is this possible? I am attaching a simple MRE below that gives me a `zip argument #2 must support iteration` error. Any pointers on what I might be doing wrong?

[My real example is a 3d-dataset with `(time,lat,lon)` dimensions and I am trying to reduce along `time` while adding two new dimensions to the output. I tried so many things and got so many errors, including the one in the title, that I thought it is better to first understand how `map_blocks` works!]

```python
# The goal is to feed in a 2d array, reduce along one dimension
# and add two new dimensions to the output.
chunks = {}
dummy = xr.DataArray(data=np.random.random([8,100]), dims=['dim1','dim2']).chunk(chunks)

def some_func(func):
    dims = func.dims
    n1 = len(func[func.dims[1]])  # This is 'dim2', we will average along 'dim1' below in the for loop
    newdim1 = 2; newdim2 = 5
    output = xr.DataArray(np.nan*np.ones([n1,newdim1,newdim2]), dims=[dims[1],'new1','new2'])
    for n in range(n1):
        fmean = func.isel(dim2=n).mean(dims[0]).compute()
        for i in range(newdim1):
            for j in range(newdim2):
                output[n,i,j] = fmean
    return output

#out = some_func(dummy)  # This works

template = xr.DataArray(np.nan*np.ones([len(dummy.dim2),2,5]), dims=['dim2','new1','new2'])
out = xr.map_blocks(some_func, dummy, template=template).compute()  # gives me the error message in the title
```

[Edit: Fixed a typo in the `n1 = len(func[func.dims[1]])` line, of course getting the same error.]
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6763/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1189140909 I_kwDOAMm_X85G4Nmt 6434 concat along dim with mix of scalar coordinate and array coordinates is not right dcherian 2448579 closed 0     3 2022-04-01T02:29:16Z 2022-04-06T01:19:47Z 2022-04-06T01:19:47Z MEMBER      

What happened?

Really hard to describe in words =)

```python
concat = xr.concat([da.isel(time=0), da.isel(time=[1])], dim="time")
xr.align(concat, da, dim="time")
```

fails when concat and da should be identical. This is causing failures in cf-xarray: https://github.com/xarray-contrib/cf-xarray/issues/319

cc @benbovy

What did you expect to happen?

No response

Minimal Complete Verifiable Example

```python
import numpy as np
import xarray as xr

time = xr.DataArray(
    np.array(
        ["2013-01-01T00:00:00.000000000", "2013-01-01T06:00:00.000000000"],
        dtype="datetime64[ns]",
    ),
    dims="time",
    name="time",
)

da = time
concat = xr.concat([da.isel(time=0), da.isel(time=[1])], dim="time")
xr.align(da, concat, join="exact")  # works

da = xr.DataArray(np.ones(time.shape), dims="time", coords={"time": time})
concat = xr.concat([da.isel(time=0), da.isel(time=[1])], dim="time")
xr.align(da, concat, join="exact")
```

Relevant log output

```

ValueError Traceback (most recent call last) Input In [27], in <module> 17 da = xr.DataArray(np.ones(time.shape), dims="time", coords={"time": time}) 18 concat = xr.concat([da.isel(time=0), da.isel(time=[1])], dim="time") ---> 19 xr.align(da, concat, join="exact")

File ~/work/python/xarray/xarray/core/alignment.py:761, in align(join, copy, indexes, exclude, fill_value, *objects) 566 """ 567 Given any number of Dataset and/or DataArray objects, returns new 568 objects with aligned indexes and dimension sizes. (...) 751 752 """ 753 aligner = Aligner( 754 objects, 755 join=join, (...) 759 fill_value=fill_value, 760 ) --> 761 aligner.align() 762 return aligner.results

File ~/work/python/xarray/xarray/core/alignment.py:549, in Aligner.align(self) 547 self.find_matching_unindexed_dims() 548 self.assert_no_index_conflict() --> 549 self.align_indexes() 550 self.assert_unindexed_dim_sizes_equal() 552 if self.join == "override":

File ~/work/python/xarray/xarray/core/alignment.py:395, in Aligner.align_indexes(self) 393 if need_reindex: 394 if self.join == "exact": --> 395 raise ValueError( 396 "cannot align objects with join='exact' where " 397 "index/labels/sizes are not equal along " 398 "these coordinates (dimensions): " 399 + ", ".join(f"{name!r} {dims!r}" for name, dims in key[0]) 400 ) 401 joiner = self._get_index_joiner(index_cls) 402 joined_index = joiner(matching_indexes)

ValueError: cannot align objects with join='exact' where index/labels/sizes are not equal along these coordinates (dimensions): 'time' ('time',) ```

Anything else we need to know?

No response

Environment

xarray main

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6434/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1171932478 I_kwDOAMm_X85F2kU- 6373 Zarr backend should avoid checking for invalid encodings dcherian 2448579 closed 0     3 2022-03-17T04:55:35Z 2022-03-18T10:06:01Z 2022-03-18T04:19:48Z MEMBER      

What is your issue?

The zarr backend has a list of "valid" encodings that needs to be updated any time zarr adds something new (e.g. https://github.com/pydata/xarray/pull/6348)

https://github.com/pydata/xarray/blob/53172cb1e03a98759faf77ef48efaa64676ad24a/xarray/backends/zarr.py#L215-L234

Can we get rid of this? I don't know the backends code well, but won't all our encoding parameters have been removed by this stage? The raise_on_invalid kwarg suggests so.

@tomwhite points out that zarr will raise a warning:

```python
>>> zarr.create((1,), blah=1)
/Users/tom/miniconda/envs/sgkit-3.8/lib/python3.8/site-packages/zarr/creation.py:221: UserWarning: ignoring keyword argument 'blah'
  warn('ignoring keyword argument %r' % k)
<zarr.core.Array (1,) float64>
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6373/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1167962844 PR_kwDOAMm_X840X9mp 6353 Add new tutorial video dcherian 2448579 closed 0     3 2022-03-14T06:54:37Z 2022-03-16T03:52:54Z 2022-03-16T03:49:23Z MEMBER   0 pydata/xarray/pulls/6353  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6353/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1038407848 PR_kwDOAMm_X84tyqLp 5904 Update docstring for apply_ufunc, set_options dcherian 2448579 closed 0     3 2021-10-28T11:33:03Z 2021-10-30T14:10:24Z 2021-10-30T14:10:23Z MEMBER   0 pydata/xarray/pulls/5904  
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5904/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
922792799 MDExOlB1bGxSZXF1ZXN0NjcxNjI1NDc3 5474 Refactor out coarsen tests dcherian 2448579 closed 0     3 2021-06-16T15:52:57Z 2021-06-21T17:04:02Z 2021-06-21T16:35:36Z MEMBER   0 pydata/xarray/pulls/5474
  • xref #5409
  • [x] Tests added
  • [x] Passes pre-commit run --all-files

Some questions:

  1. flake8 fails with some false positives. What do we do about that?
  2. I am importing the da and ds fixtures from test_dataarray and test_dataset. Is that the pattern we want to follow?

```
xarray/tests/test_coarsen.py:9:1: F401 '.test_dataarray.da' imported but unused
xarray/tests/test_coarsen.py:10:1: F401 '.test_dataset.ds' imported but unused
xarray/tests/test_coarsen.py:13:36: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:20:26: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:35:25: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:51:5: F811 redefinition of unused 'da' from line 9
xarray/tests/test_coarsen.py:62:5: F811 redefinition of unused 'da' from line 9
xarray/tests/test_coarsen.py:84:5: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:156:5: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:186:25: F811 redefinition of unused 'ds' from line 10
xarray/tests/test_coarsen.py:217:5: F811 redefinition of unused 'da' from line 9
xarray/tests/test_coarsen.py:269:5: F811 redefinition of unused 'da' from line 9
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5474/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
501108453 MDU6SXNzdWU1MDExMDg0NTM= 3363 user-friendly additions for dask usage dcherian 2448579 closed 0     3 2019-10-01T19:48:27Z 2021-04-19T03:34:18Z 2021-04-19T03:34:18Z MEMBER      

Any thoughts on adding

  1. .chunksize or .nbytes_chunk
  2. .ntasks : this would be len(obj.__dask_graph__())
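A minimal sketch of what such attributes could compute, assuming a dask-style `chunks` tuple-of-tuples (the helper names here are hypothetical, not proposed API):

```python
from math import prod

def largest_chunk_nbytes(chunks, itemsize):
    """Bytes in the largest chunk, given dask-style chunks,
    e.g. chunks=((2, 2, 1), (10, 10)) for a (5, 20) array."""
    return prod(max(sizes) for sizes in chunks) * itemsize

def nchunks(chunks):
    """Total number of chunks -- a rough proxy for graph size;
    .ntasks itself would be len(obj.__dask_graph__())."""
    return prod(len(sizes) for sizes in chunks)
```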
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3363/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
694182591 MDExOlB1bGxSZXF1ZXN0NDgwNTk3OTk3 4407 Dataset.plot.quiver dcherian 2448579 closed 0     3 2020-09-05T21:04:05Z 2021-02-19T14:21:47Z 2021-02-19T14:21:45Z MEMBER   0 pydata/xarray/pulls/4407
  • [x] Closes #4373
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [x] New functions/methods are listed in api.rst

I could use some help with adding tests and parameter checking if someone wants to help :)

```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds.coords["x"] = ("x", np.arange(10))
ds.coords["y"] = ("y", np.arange(20))
ds.coords["t"] = ("t", np.arange(4))
ds["u"] = np.sin((ds.x - 5) / 5) * np.sin((ds.y - 10) / 10)
ds["v"] = np.sin((ds.x - 5) / 5) * np.cos((ds.y - 10) / 10)

ds = ds * 2 * np.cos((ds.t) * 2 * 3.14 / 0.75)

ds["u"].attrs["units"] = "m/s"
ds["mag"] = np.hypot(ds.u, ds.v)

ds.mag.plot(col="t", x="x")

fg = ds.plot.quiver(x="x", y="y", u="u", v="v", col="t", hue="mag")
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4407/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
685825824 MDU6SXNzdWU2ODU4MjU4MjQ= 4376 wrong chunk sizes in html repr with nonuniform chunks dcherian 2448579 open 0     3 2020-08-25T21:23:11Z 2020-10-07T11:11:23Z   MEMBER      

What happened:

The HTML repr uses only the first element of each dimension's chunks tuple;

What you expected to happen:

it should be using whatever dask does in this case

Minimal Complete Verifiable Example:

```python
import xarray as xr
import dask.array

test = xr.DataArray(
    dask.array.zeros(
        (12, 901, 1001),
        chunks=(
            (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1),
            (1, 899, 1),
            (1, 199, 1, 199, 1, 199, 1, 199, 1, 199, 1),
        ),
    )
)
test.to_dataset(name="a")
```

EDIT: The text repr has the same issue:

```
<xarray.Dataset>
Dimensions:  (dim_0: 12, dim_1: 901, dim_2: 1001)
Dimensions without coordinates: dim_0, dim_1, dim_2
Data variables:
    a        (dim_0, dim_1, dim_2) float64 dask.array<chunksize=(1, 1, 1), meta=np.ndarray>
```
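One hypothetical way a repr could summarize nonuniform chunks per dimension, instead of taking only the first element (the range notation below is an assumption for illustration, not dask's actual format):

```python
def chunk_summary(chunks):
    """Summarize dask-style chunks per dimension: a single number
    when the chunks are uniform, 'min-max' when they are not."""
    parts = []
    for sizes in chunks:
        if len(set(sizes)) == 1:
            parts.append(str(sizes[0]))
        else:
            parts.append(f"{min(sizes)}-{max(sizes)}")
    return "(" + ", ".join(parts) + ")"
```

For the example above this would render something like `(1, 1-899, 1-199)` rather than the misleading `(1, 1, 1)`.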

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4376/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
623230375 MDExOlB1bGxSZXF1ZXN0NDIxOTM3MTk2 4088 Fix conversion of multiindexed pandas objects to sparse xarray objects dcherian 2448579 closed 0     3 2020-05-22T13:59:21Z 2020-05-26T22:20:06Z 2020-05-26T22:20:02Z MEMBER   0 pydata/xarray/pulls/4088
  • [x] Closes #4019
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

~Doesn't have a proper test. Need some help here~. cc @bnaul

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4088/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
589834306 MDExOlB1bGxSZXF1ZXN0Mzk1MjgyNzg0 3915 facetgrid: Ensure that colormap params are only determined once. dcherian 2448579 closed 0     3 2020-03-29T16:52:28Z 2020-04-11T16:12:00Z 2020-04-11T16:11:53Z MEMBER   0 pydata/xarray/pulls/3915
  • [x] Closes #3569
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

Not sure how to test this.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3915/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
584429748 MDU6SXNzdWU1ODQ0Mjk3NDg= 3867 macos py38 CI failing dcherian 2448579 closed 0     3 2020-03-19T13:54:10Z 2020-03-29T22:13:26Z 2020-03-29T22:13:26Z MEMBER      

import matplotlib is failing when it imports PIL

```
E   ImportError: dlopen(/usr/local/miniconda/envs/xarray-tests/lib/python3.8/site-packages/PIL/_imaging.cpython-38-darwin.so, 2): Library not loaded: @rpath/libwebp.7.dylib
E   Referenced from: /usr/local/miniconda/envs/xarray-tests/lib/libtiff.5.dylib
E   Reason: Incompatible library version: libtiff.5.dylib requires version 9.0.0 or later, but libwebp.7.dylib provides version 8.0.0

/usr/local/miniconda/envs/xarray-tests/lib/python3.8/site-packages/PIL/Image.py:69: ImportError
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3867/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
576912650 MDExOlB1bGxSZXF1ZXN0Mzg0ODA1NjIy 3839 Fix alignment with join="override" when some dims are unindexed dcherian 2448579 closed 0     3 2020-03-06T12:52:50Z 2020-03-13T13:59:25Z 2020-03-13T13:25:13Z MEMBER   0 pydata/xarray/pulls/3839
  • [x] Closes #3681
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3839/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
558241968 MDExOlB1bGxSZXF1ZXN0MzY5NjY5NTM3 3738 Add twine check and readthedocs reminder to HOW_TO_RELEASE dcherian 2448579 closed 0     3 2020-01-31T16:43:39Z 2020-02-24T20:39:03Z 2020-02-24T18:52:04Z MEMBER   0 pydata/xarray/pulls/3738
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3738/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
538521262 MDExOlB1bGxSZXF1ZXN0MzUzNjUzNTEz 3629 apply_ufunc vectorize 1D function example dcherian 2448579 closed 0     3 2019-12-16T16:33:36Z 2020-01-16T18:06:42Z 2020-01-15T15:25:57Z MEMBER   0 pydata/xarray/pulls/3629

I added an example on using apply_ufunc to vectorize a 1D example over a DataArray. Comments and feedback welcome.

I added an example of using numba's guvectorize too. Thoughts on keeping that bit?

  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

cc @rabernat @TomNicholas

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3629/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
499477368 MDU6SXNzdWU0OTk0NzczNjg= 3350 assert_equal and dask dcherian 2448579 closed 0     3 2019-09-27T14:25:14Z 2020-01-10T16:10:57Z 2020-01-10T16:10:57Z MEMBER      

MCVE Code Sample

Example 1

```python
import xarray as xr
import numpy as np

da = xr.DataArray(np.random.randn(10, 20), name="a")
ds = da.to_dataset()
xr.testing.assert_equal(da, da.chunk({"dim_0": 2}))          # works
xr.testing.assert_equal(da.chunk(), da.chunk({"dim_0": 2}))  # works

xr.testing.assert_equal(ds, ds.chunk({"dim_0": 2}))          # works
xr.testing.assert_equal(ds.chunk(), ds.chunk({"dim_0": 2}))  # does not work
```

I get

```

ValueError Traceback (most recent call last) <ipython-input-1-bc8216a67408> in <module> 8 9 xr.testing.assert_equal(ds, ds.chunk({"dim_0": 2})) # works ---> 10 xr.testing.assert_equal(ds.chunk(), ds.chunk({"dim_0": 2})) # does not work

~/work/python/xarray/xarray/testing.py in assert_equal(a, b) 56 assert a.equals(b), formatting.diff_array_repr(a, b, "equals") 57 elif isinstance(a, Dataset): ---> 58 assert a.equals(b), formatting.diff_dataset_repr(a, b, "equals") 59 else: 60 raise TypeError("{} not supported by assertion comparison".format(type(a)))

~/work/python/xarray/xarray/core/dataset.py in equals(self, other) 1322 """ 1323 try: -> 1324 return self._all_compat(other, "equals") 1325 except (TypeError, AttributeError): 1326 return False

~/work/python/xarray/xarray/core/dataset.py in _all_compat(self, other, compat_str) 1285 1286 return self._coord_names == other._coord_names and utils.dict_equiv( -> 1287 self._variables, other._variables, compat=compat 1288 ) 1289

~/work/python/xarray/xarray/core/utils.py in dict_equiv(first, second, compat) 335 """ 336 for k in first: --> 337 if k not in second or not compat(first[k], second[k]): 338 return False 339 for k in second:

~/work/python/xarray/xarray/core/dataset.py in compat(x, y) 1282 # require matching order for equality 1283 def compat(x: Variable, y: Variable) -> bool: -> 1284 return getattr(x, compat_str)(y) 1285 1286 return self._coord_names == other._coord_names and utils.dict_equiv(

~/work/python/xarray/xarray/core/variable.py in equals(self, other, equiv) 1558 try: 1559 return self.dims == other.dims and ( -> 1560 self._data is other._data or equiv(self.data, other.data) 1561 ) 1562 except (TypeError, AttributeError):

~/work/python/xarray/xarray/core/duck_array_ops.py in array_equiv(arr1, arr2) 201 warnings.filterwarnings("ignore", "In the future, 'NAT == x'") 202 flag_array = (arr1 == arr2) | (isnull(arr1) & isnull(arr2)) --> 203 return bool(flag_array.all()) 204 205

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/array/core.py in bool(self) 1380 ) 1381 else: -> 1382 return bool(self.compute()) 1383 1384 nonzero = bool # python 2

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/base.py in compute(self, kwargs) 173 dask.base.compute 174 """ --> 175 (result,) = compute(self, traverse=False, kwargs) 176 return result 177

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/base.py in compute(args, kwargs) 444 keys = [x.dask_keys() for x in collections] 445 postcomputes = [x.dask_postcompute() for x in collections] --> 446 results = schedule(dsk, keys, kwargs) 447 return repack([f(r, a) for r, (f, a) in zip(results, postcomputes)]) 448

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, pool, kwargs) 80 get_id=_thread_get_id, 81 pack_exception=pack_exception, ---> 82 kwargs 83 ) 84

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs) 489 _execute_task(task, data) # Re-execute locally 490 else: --> 491 raise_exception(exc, tb) 492 res, worker_id = loads(res_info) 493 state["cache"][key] = res

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/compatibility.py in reraise(exc, tb) 128 if exc.traceback is not tb: 129 raise exc.with_traceback(tb) --> 130 raise exc 131 132 import pickle as cPickle

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception) 231 try: 232 task, data = loads(task_info) --> 233 result = _execute_task(task, data) 234 id = get_id() 235 result = dumps((result, id))

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/core.py in _execute_task(arg, cache, dsk) 117 func, args = arg[0], arg[1:] 118 args2 = [_execute_task(a, cache) for a in args] --> 119 return func(*args2) 120 elif not ishashable(arg): 121 return arg

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/optimization.py in call(self, *args) 1057 if not len(args) == len(self.inkeys): 1058 raise ValueError("Expected %d args, got %d" % (len(self.inkeys), len(args))) -> 1059 return core.get(self.dsk, self.outkey, dict(zip(self.inkeys, args))) 1060 1061 def reduce(self):

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/core.py in get(dsk, out, cache) 147 for key in toposort(dsk): 148 task = dsk[key] --> 149 result = _execute_task(task, cache) 150 cache[key] = result 151 result = _execute_task(out, cache)

~/miniconda3/envs/dcpy/lib/python3.7/site-packages/dask/core.py in _execute_task(arg, cache, dsk) 117 func, args = arg[0], arg[1:] 118 args2 = [_execute_task(a, cache) for a in args] --> 119 return func(*args2) 120 elif not ishashable(arg): 121 return arg

ValueError: operands could not be broadcast together with shapes (0,20) (2,20) ```

Example 2

The relevant xarray line in the previous traceback is `flag_array = (arr1 == arr2) | (isnull(arr1) & isnull(arr2))`, so I tried

```python
(ds.isnull() & ds.chunk({"dim_0": 1}).isnull()).compute()          # works
(ds.chunk().isnull() & ds.chunk({"dim_0": 1}).isnull()).compute()  # does not work?!
```

```

ValueError                                Traceback (most recent call last)
<ipython-input-4-abdfbeda355a> in <module>
      1 (ds.isnull() & ds.chunk({"dim_0": 1}).isnull()).compute()  # works
----> 2 (ds.chunk().isnull() & ds.chunk({"dim_0": 1}).isnull()).compute()  # does not work?!

~/work/python/xarray/xarray/core/dataset.py in compute(self, **kwargs)
    791         """
    792         new = self.copy(deep=False)
--> 793         return new.load(**kwargs)
    794
    795     def _persist_inplace(self, **kwargs) -> "Dataset":

~/work/python/xarray/xarray/core/dataset.py in load(self, **kwargs)
    645
    646         for k, data in zip(lazy_data, evaluated_data):
--> 647             self.variables[k].data = data
    648
    649         # load everything else sequentially

~/work/python/xarray/xarray/core/variable.py in data(self, data)
    331         data = as_compatible_data(data)
    332         if data.shape != self.shape:
--> 333             raise ValueError("replacement data must match the Variable's shape")
    334         self._data = data
    335

ValueError: replacement data must match the Variable's shape
```

Problem Description

I don't know what's going on here. I expect `assert_equal` to pass for all of these examples.

Our test for isnull with dask always calls load before comparing:

```python
def test_isnull_with_dask():
    da = construct_dataarray(2, np.float32, contains_nan=True, dask=True)
    assert isinstance(da.isnull().data, dask_array_type)
    assert_equal(da.isnull().load(), da.load().isnull())
```

Output of xr.show_versions()

xarray master & dask 2.3.0

```
INSTALLED VERSIONS
------------------
commit: 6ece6a1cf424c3080e216fad8fc8058d3b70aadc
python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.15.0-64-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.4
libnetcdf: 4.6.2

xarray: 0.13.0+13.g6ece6a1c
pandas: 0.25.1
numpy: 1.17.2
scipy: 1.3.1
netCDF4: 1.5.1.2
pydap: None
h5netcdf: 0.7.4
h5py: 2.9.0
Nio: None
zarr: None
cftime: 1.0.3.4
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: 0.9.7.1
iris: None
bottleneck: 1.2.1
dask: 2.3.0
distributed: 2.3.2
matplotlib: 3.1.1
cartopy: 0.17.0
seaborn: 0.9.0
numbagg: None
setuptools: 41.2.0
pip: 19.2.3
conda: 4.7.11
pytest: 5.1.2
IPython: 7.8.0
sphinx: 2.2.0
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3350/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
452629448 MDU6SXNzdWU0NTI2Mjk0NDg= 2999 median on dask arrays dcherian 2448579 closed 0     3 2019-06-05T17:37:46Z 2019-12-30T17:46:44Z 2019-12-30T17:46:44Z MEMBER      

Dask has updated its percentile and quantile implementations: https://github.com/dask/dask/pull/4677

Can we now update our median method to work with dask arrays?
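For reference, `dask.array.percentile` accepts only 1-d arrays, and its result is approximate when the array has multiple chunks (per-chunk percentiles are merged). A minimal sketch of the primitive a dask-backed `median` could build on; with a single chunk the result is exact:

```python
import numpy as np
import dask.array as da

# single chunk, so the chunk-merging approximation is a no-op
x = da.from_array(np.arange(10.0), chunks=10)
med = da.percentile(x, [50]).compute()
print(med)  # [4.5]
```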

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2999/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
513068398 MDExOlB1bGxSZXF1ZXN0MzMyOTI3ODIy 3453 Optimize dask array equality checks. dcherian 2448579 closed 0     3 2019-10-28T02:44:14Z 2019-11-05T15:41:22Z 2019-11-05T15:41:15Z MEMBER   0 pydata/xarray/pulls/3453

Dask arrays with the same graph have the same name. We can use this to quickly compare dask-backed variables without computing. (see https://github.com/pydata/xarray/issues/3068#issuecomment-508853564)
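A minimal illustration of the idea (dask names are deterministic tokens of the graph, so comparing `.name` attributes decides equality without computing anything):

```python
import dask.array as da

x = da.ones((4, 4), chunks=2)
y = da.ones((4, 4), chunks=2)

# identical graphs tokenize to identical names
print(x.name == y.name)        # True: same graph, no compute needed
print(x.name == (x + 1).name)  # False: different graph
```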

I will work on adding extra tests but could use feedback on the approach.

  • [x] Closes #3068, closes #3311, closes #3454
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

@djhoese, thanks for the great example code!

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3453/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
501150299 MDExOlB1bGxSZXF1ZXN0MzIzNDgwMDky 3364 Make concat more forgiving with variables that are being merged. dcherian 2448579 closed 0     3 2019-10-01T21:15:54Z 2019-10-17T01:30:32Z 2019-10-14T18:06:54Z MEMBER   0 pydata/xarray/pulls/3364
  • [x] Closes #508
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

Downstream issue: https://github.com/marbl-ecosys/cesm2-marbl/issues/1

Basically, we are currently raising an error when attempting to merge variables that are present in some datasets but not others that are provided to concat. This seems unnecessarily strict and it turns out we had an issue for it! (#508)
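A sketch of the behaviour this enables, assuming a variable missing from some inputs is concatenated anyway and filled with NaN where absent:

```python
import numpy as np
import xarray as xr

ds1 = xr.Dataset({"a": ("x", [1.0, 2.0]), "b": ("x", [3.0, 4.0])})
ds2 = xr.Dataset({"a": ("x", [5.0, 6.0])})  # no "b" here

# instead of raising, "b" is padded with NaN over ds2's extent
out = xr.concat([ds1, ds2], dim="x")
print(out["b"].values)  # [ 3.  4. nan nan]
```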

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3364/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
501059947 MDExOlB1bGxSZXF1ZXN0MzIzNDA1OTEz 3362 Fix concat bug when concatenating unlabeled dimensions. dcherian 2448579 closed 0     3 2019-10-01T18:10:22Z 2019-10-08T22:30:38Z 2019-10-08T22:13:48Z MEMBER   0 pydata/xarray/pulls/3362

This fixes the following behaviour. (downstream issue https://github.com/xgcm/xgcm/issues/154)

```python
def test_concat(self, data):
    split_data = [
        data.isel(dim1=slice(3)),
        data.isel(dim1=3),  # this wouldn't work on master
        data.isel(dim1=slice(4, None)),
    ]
    assert_identical(data, concat(split_data, "dim1"))
```
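A self-contained version of the fixed behaviour, using a hypothetical minimal dataset (the scalar `isel` in the middle drops the concat dimension, which is what used to break):

```python
import numpy as np
import xarray as xr

data = xr.Dataset({"v": ("dim1", np.arange(8.0))})
split = [
    data.isel(dim1=slice(3)),
    data.isel(dim1=3),            # scalar selection drops "dim1"
    data.isel(dim1=slice(4, None)),
]
# concat promotes the 0-d piece back along "dim1"
out = xr.concat(split, "dim1")
print(out["v"].values)  # [0. 1. 2. 3. 4. 5. 6. 7.]
```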

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3362/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
439407583 MDExOlB1bGxSZXF1ZXN0Mjc1MjE5ODQ4 2934 Docs/more fixes dcherian 2448579 closed 0     3 2019-05-02T02:43:29Z 2019-10-04T19:43:44Z 2019-10-04T17:04:37Z MEMBER   0 pydata/xarray/pulls/2934
  • partially addresses #2909 , closes #2901, closes #2908
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2934/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 1,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
459401826 MDExOlB1bGxSZXF1ZXN0MjkwNzcxODIx 3038 Revert cmap fix dcherian 2448579 closed 0     3 2019-06-21T23:11:09Z 2019-08-15T15:32:42Z 2019-06-22T17:16:36Z MEMBER   0 pydata/xarray/pulls/3038

Unfortunately my fix in #2935 broke some major functionality. A proper fix would involve some facetgrid refactoring I think; so that'll take some time. This reverts that commit and adds a test.

```python
xr.DataArray(np.random.randn(10, 20)).plot(levels=10, cmap=mpl.cm.RdBu)
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3038/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
467456554 MDExOlB1bGxSZXF1ZXN0Mjk3MTA1NDg1 3102 mfdataset, concat now support the 'join' kwarg. dcherian 2448579 closed 0     3 2019-07-12T14:52:25Z 2019-08-09T16:55:24Z 2019-08-07T12:17:07Z MEMBER   0 pydata/xarray/pulls/3102
  • [x] Closes #1354
  • [x] Tests added
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

I won't work on it for the next few days if someone else wants to take over...

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3102/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
362918990 MDExOlB1bGxSZXF1ZXN0MjE3NDk1OTc5 2433 Contour labels kwarg dcherian 2448579 closed 0     3 2018-09-23T06:52:15Z 2019-06-13T15:35:44Z 2019-06-13T15:35:44Z MEMBER   0 pydata/xarray/pulls/2433
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

Adds a boolean `labels` kwarg to `contour` that draws contour labels. Also adds a `clabel_kwargs` argument that is passed on to `Axes.clabel()`:

```python
air.isel(time=0).plot.contour(labels=True, colors='k', clabel_kwargs={'fmt': '%.1f'})
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2433/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
386294605 MDExOlB1bGxSZXF1ZXN0MjM1MDcxOTQ5 2584 Fix parsing '_Unsigned' attribute dcherian 2448579 closed 0     3 2018-11-30T18:11:03Z 2019-04-12T16:29:22Z 2018-12-15T23:53:19Z MEMBER   0 pydata/xarray/pulls/2584
  • [x] Closes #2583
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2584/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
399042126 MDU6SXNzdWUzOTkwNDIxMjY= 2673 NaT tests need to be fixed on master dcherian 2448579 closed 0     3 2019-01-14T19:37:45Z 2019-01-15T16:54:56Z 2019-01-15T11:19:59Z MEMBER      

```
=================================== FAILURES ===================================
____________________ TestVariable.test_index_0d_not_a_time _____________________

self = <xarray.tests.test_variable.TestVariable object at 0x7f0dd7b6bda0>

    def test_index_0d_not_a_time(self):
        d = np.datetime64('NaT', 'ns')
        x = self.cls(['x'], [d])
>       self._assertIndexedLikeNDArray(x, d)

xarray/tests/test_variable.py:197:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.tests.test_variable.TestVariable object at 0x7f0dd7b6bda0>
variable = <xarray.Variable (x: 1)> array(['NaT'], dtype='datetime64[ns]')
expected_value0 = numpy.datetime64('NaT'), expected_dtype = None

    def _assertIndexedLikeNDArray(self, variable, expected_value0,
                                  expected_dtype=None):
        """Given a 1-dimensional variable, verify that the variable is indexed
        like a numpy.ndarray.
        """
        assert variable[0].shape == ()
        assert variable[0].ndim == 0
        assert variable[0].size == 1
        # test identity
        assert variable.equals(variable.copy())
        assert variable.identical(variable.copy())
        # check value is equal for both ndarray and Variable
        with warnings.catch_warnings():
            warnings.filterwarnings('ignore', "In the future, 'NAT == x'")
>           assert variable.values[0] == expected_value0
E           AssertionError: assert numpy.datetime64('NaT') == numpy.datetime64('NaT')
E             -numpy.datetime64('NaT')
E             +numpy.datetime64('NaT')

xarray/tests/test_variable.py:143: AssertionError
________________ TestVariableWithDask.test_index_0d_not_a_time _________________

self = <xarray.tests.test_variable.TestVariableWithDask object at 0x7f0e00bc5978>

    def test_index_0d_not_a_time(self):
        d = np.datetime64('NaT', 'ns')
        x = self.cls(['x'], [d])
>       self._assertIndexedLikeNDArray(x, d)

xarray/tests/test_variable.py:197:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.tests.test_variable.TestVariableWithDask object at 0x7f0e00bc5978>
variable = <xarray.Variable (x: 1)> dask.array<shape=(1,), dtype=datetime64[ns], chunksize=(1,)>
expected_value0 = numpy.datetime64('NaT'), expected_dtype = None

    def _assertIndexedLikeNDArray(self, variable, expected_value0,
                                  expected_dtype=None):
        """Given a 1-dimensional variable, verify that the variable is indexed
        like a numpy.ndarray.
        """
        assert variable[0].shape == ()
        assert variable[0].ndim == 0
        assert variable[0].size == 1
        # test identity
        assert variable.equals(variable.copy())
        assert variable.identical(variable.copy())
        # check value is equal for both ndarray and Variable
        with warnings.catch_warnings():
            warnings.filterwarnings('ignore', "In the future, 'NAT == x'")
>           assert variable.values[0] == expected_value0
E           AssertionError: assert numpy.datetime64('NaT') == numpy.datetime64('NaT')
E             -numpy.datetime64('NaT')
E             +numpy.datetime64('NaT')

xarray/tests/test_variable.py:143: AssertionError
_______________ TestIndexVariable.test_index_0d_not_a_time ______________________

self = <xarray.tests.test_variable.TestIndexVariable object at 0x7f0e01063390>

    def test_index_0d_not_a_time(self):
        d = np.datetime64('NaT', 'ns')
        x = self.cls(['x'], [d])
>       self._assertIndexedLikeNDArray(x, d)

xarray/tests/test_variable.py:197:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <xarray.tests.test_variable.TestIndexVariable object at 0x7f0e01063390>
variable = <xarray.IndexVariable 'x' (x: 1)> array(['NaT'], dtype='datetime64[ns]')
expected_value0 = numpy.datetime64('NaT'), expected_dtype = None

    def _assertIndexedLikeNDArray(self, variable, expected_value0,
                                  expected_dtype=None):
        """Given a 1-dimensional variable, verify that the variable is indexed
        like a numpy.ndarray.
        """
        assert variable[0].shape == ()
        assert variable[0].ndim == 0
        assert variable[0].size == 1
        # test identity
        assert variable.equals(variable.copy())
        assert variable.identical(variable.copy())
        # check value is equal for both ndarray and Variable
        with warnings.catch_warnings():
            warnings.filterwarnings('ignore', "In the future, 'NAT == x'")
>           assert variable.values[0] == expected_value0
E           AssertionError: assert numpy.datetime64('NaT') == numpy.datetime64('NaT')
E             -numpy.datetime64('NaT')
E             +numpy.datetime64('NaT')

xarray/tests/test_variable.py:143: AssertionError
```
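The root cause is easy to reproduce: NaT, like NaN, compares unequal to everything, itself included, so the `==` assertion can never hold for NaT entries:

```python
import numpy as np

nat = np.datetime64("NaT", "ns")
print(nat == nat)     # False: NaT is unequal even to itself
print(np.isnat(nat))  # True: use isnat (or xarray's isnull) to test for NaT
```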

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2673/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
364405781 MDExOlB1bGxSZXF1ZXN0MjE4NjAwMzM5 2443 Properly support user-provided norm. dcherian 2448579 closed 0     3 2018-09-27T10:25:33Z 2018-10-08T05:23:47Z 2018-10-08T05:23:35Z MEMBER   0 pydata/xarray/pulls/2443
  • [x] Closes #2381
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2443/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
345493321 MDExOlB1bGxSZXF1ZXN0MjA0NjE5NjY0 2328 Silence some warnings. dcherian 2448579 closed 0     3 2018-07-29T01:46:27Z 2018-09-04T15:39:39Z 2018-09-04T15:39:23Z MEMBER   0 pydata/xarray/pulls/2328
  • [x] Tests passed (for all non-documentation changes)

Remove some warnings.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2328/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
341713032 MDExOlB1bGxSZXF1ZXN0MjAxNzg2NDgy 2294 Additional axis kwargs dcherian 2448579 closed 0     3 2018-07-16T23:25:37Z 2018-07-31T22:28:58Z 2018-07-31T22:28:44Z MEMBER   0 pydata/xarray/pulls/2294
  • [x] Closes #2224 (remove if there is no corresponding issue, which should only be the case for minor changes)
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

This PR adds support for xscale, yscale, xticks, yticks, xlim, ylim kwargs following discussion in #2224 and the Pandas API (https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.plot.html#pandas.DataFrame.plot).

Haven't added FacetGrid support yet. I'll get to that soon.
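A quick sketch of the new keywords in use; the values map straight onto the matplotlib `Axes` settings (the `Agg` backend is selected here only so the example runs headless):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the example

import numpy as np
import xarray as xr

da = xr.DataArray(np.logspace(0, 3, 50), dims="x",
                  coords={"x": np.linspace(1.0, 10.0, 50)})

# yscale/xlim are forwarded to the Axes, like pandas' plot kwargs
(line,) = da.plot(yscale="log", xlim=(1, 10))
print(line.axes.get_yscale())  # log
```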

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2294/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
331387539 MDU6SXNzdWUzMzEzODc1Mzk= 2224 Add axis scaling kwargs to DataArray.plot() dcherian 2448579 closed 0     3 2018-06-11T23:41:42Z 2018-07-31T22:28:44Z 2018-07-31T22:28:44Z MEMBER      

It would be useful to add the boolean kwargs `logx`, `logy`, `loglog` to `DataArray.plot()`, just as in `pandas.DataFrame.plot()`:

https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.plot.html#pandas.DataFrame.plot

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2224/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
345426020 MDExOlB1bGxSZXF1ZXN0MjA0NTgxMDA1 2325 interp() now accepts date strings as desired co-ordinate locations dcherian 2448579 closed 0     3 2018-07-28T07:14:22Z 2018-07-30T00:33:13Z 2018-07-29T06:09:41Z MEMBER   0 pydata/xarray/pulls/2325
  • [x] Closes #2284
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

```python
da = xr.DataArray([1, 5], dims=['time'],
                  coords={'time': [np.datetime64('2014-05-06'),
                                   np.datetime64('2014-05-10')]})
da.interp(time='2014-05-07')
```

```
<xarray.DataArray ()>
array(2.)
Coordinates:
    time     datetime64[ns] 2014-05-07
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2325/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
276185101 MDExOlB1bGxSZXF1ZXN0MTU0MjQxMDUx 1737 WIP: 1d+2d coord plotting dcherian 2448579 closed 0     3 2017-11-22T19:43:34Z 2017-12-19T23:49:39Z 2017-11-29T11:50:09Z MEMBER   0 pydata/xarray/pulls/1737
  • [ ] Closes #xxxx
  • [x] Tests added / passed
  • [x] Passes git diff upstream/master **/*py | flake8 --diff
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

This PR teaches `plot.contourf()` to contour variables with both a 1-D co-ordinate (e.g. `time`) and a 2-D co-ordinate (e.g. time-varying `depth`). The example array `da`:

```
<xarray.DataArray 'S' (z: 5, time: 85646)>
array([[       nan,        nan,        nan, ...,        nan,        nan,        nan],
       [ 35.02816 ,  34.729567,  34.779223, ...,  34.57513 ,  34.671975,  34.334675],
       [ 35.206943,  35.163239,  34.938674, ...,  34.780728,  34.836331,  34.70386 ],
       [       nan,  35.184057,  35.10592 , ...,  34.656925,  34.776915,  34.429442],
       [ 34.85562 ,  34.81994 ,  35.00963 , ...,  35.014522,  34.9747  ,  35.033848]])
Coordinates:
    depth    (z, time) float16 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 53.5 ...
  * time     (time) datetime64[ns] 2013-12-19T06:00:01 2013-12-19T06:10:01 ...
Dimensions without coordinates: z
```

Now we can do `da.plot(x='time', y='depth')`.
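A self-contained version of the example, with synthetic data standing in for the original dataset (the shapes and names here are illustrative only; the `Agg` backend is selected so it runs headless):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the example

import numpy as np
import xarray as xr

time = np.arange(20)
depth = np.linspace(0.0, 100.0, 5)[:, None] + 0.5 * time  # (z, time), 2-D
da = xr.DataArray(np.random.randn(5, 20), dims=("z", "time"),
                  coords={"time": time, "depth": (("z", "time"), depth)})

# 1-D x co-ordinate plotted against a 2-D y co-ordinate
mesh = da.plot(x="time", y="depth")
```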

A couple of questions:

1. I've added a test, but am not sure how to work in an assert statement.
2. How do I test that I haven't messed up the syntax in whats-new.rst?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1737/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
