issues

13 rows where comments = 4 and user = 14371165 sorted by updated_at descending


id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
1916363233 PR_kwDOAMm_X85bYe7b 8241 Use strict type hinting for namedarray Illviljan 14371165 closed 0     4 2023-09-27T21:32:41Z 2024-02-02T18:12:22Z 2023-10-03T17:18:41Z MEMBER   0 pydata/xarray/pulls/8241

Towards the strict goal in #8239.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8241/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1318800553 I_kwDOAMm_X85Om0yp 6833 Require a pull request before merging to main Illviljan 14371165 closed 0     4 2022-07-26T22:09:55Z 2023-01-13T16:51:03Z 2023-01-13T16:51:03Z MEMBER      

Is your feature request related to a problem?

I was making sure the test in #6832 failed on main; when it did, I wrote a few lines in the what's new file but forgot to switch back to the other branch and accidentally pushed directly to main. :(

Describe the solution you'd like

I think it's best if we require a pull request for merging. We seem to pretty much do this anyway.

It seems to be the "Require a pull request before merging" branch-protection setting, if I understand correctly (screenshot omitted).

Describe alternatives you've considered

No response

Additional context

No response

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6833/reactions",
    "total_count": 3,
    "+1": 3,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1462833576 PR_kwDOAMm_X85Dnp5w 7319 mypy - Remove some ignored packages and modules Illviljan 14371165 closed 0     4 2022-11-24T06:34:13Z 2022-11-26T15:39:12Z 2022-11-26T15:39:11Z MEMBER   0 pydata/xarray/pulls/7319

dask has added py.typed files, so the ignores shouldn't be needed anymore: https://github.com/dask/dask/pull/8854, https://github.com/dask/distributed/pull/5328. numpy and pint have done the same.

pycompat.py doesn't error anymore, so it's good to type check that one as well. Also fixed a Python 3.8-related error in it.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7319/reactions",
    "total_count": 2,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 2,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1452339775 PR_kwDOAMm_X85DEC8r 7296 Fix some typing errors in DuckArrayModule Illviljan 14371165 closed 0     4 2022-11-16T22:07:00Z 2022-11-20T18:52:12Z 2022-11-20T10:18:53Z MEMBER   0 pydata/xarray/pulls/7296

Fixes these errors that I've been seeing locally for a while:

```
!mypy C:\Users\J.W\Documents\GitHub\xarray\xarray\core\coordinates.py --ignore-missing-imports
C:\Users\J.W\Documents\GitHub\xarray\xarray\core\pycompat.py:48: error: Incompatible types in assignment (expression has type "Tuple[]", variable has type "Tuple[Any]")
C:\Users\J.W\Documents\GitHub\xarray\xarray\core\pycompat.py:50: error: Incompatible types in assignment (expression has type "Optional[Module]", variable has type "Optional[Literal['dask', 'pint', 'cupy', 'sparse']]")
Found 2 errors in 1 file (checked 1 source file)
```

Not sure why the CI isn't catching these?
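The first error above is the classic fixed-length-tuple pitfall. A minimal sketch of it, with hypothetical names rather than xarray's actual pycompat code:

```python
from typing import Any, Tuple

# Hypothetical sketch of the pitfall behind the first mypy error:
# Tuple[Any] means "a 1-tuple", so assigning an empty tuple to it fails.
# The variable-length form Tuple[Any, ...] accepts a tuple of any length.
duck_array_types: Tuple[Any, ...] = ()  # OK with the variable-length form

def register_duck_type(t: type) -> None:
    """Append a type to the registry (illustrative helper, not xarray API)."""
    global duck_array_types
    duck_array_types = duck_array_types + (t,)

register_duck_type(list)
```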

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7296/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1302483271 PR_kwDOAMm_X847SWhZ 6778 Add dataarray scatter Illviljan 14371165 closed 0     4 2022-07-12T18:44:29Z 2022-10-07T19:37:18Z 2022-10-07T15:43:29Z MEMBER   0 pydata/xarray/pulls/6778

Splitting up #5622 as the scope of it has grown too large now.

  • Adds support for dataarray scatter plots and replaces the dataset version.
  • Scatter now has 3d support with the z argument.
  • Scatter now always returns a single PathCollection; earlier it could return a list of PathCollections when using categoricals.
  • Better legend that now handles categoricals, making the continuous/discrete options slightly redundant (they are still there, though).
  • Facetgrid generalized slightly to handle 3d plots.

TODO:
  • sharex/sharey test, https://github.com/pydata/xarray/pull/7047: len(list(ax.get_shared_x_axes())) != 0

  • [x] Tests added
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6778/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
1376156646 PR_kwDOAMm_X84_HFMo 7047 Set sharex/sharey to False if using 3d plots Illviljan 14371165 closed 0     4 2022-09-16T16:16:57Z 2022-09-25T15:33:29Z 2022-09-25T15:33:29Z MEMBER   0 pydata/xarray/pulls/7047

Matplotlib's 3d plots appear not to support sharex/sharey, so reset them to the default values instead. This improves the look of the plot, as the axis values aren't deleted.

Example:

```python
import matplotlib.pyplot as plt

fig = plt.figure()
subplot_kws = {"projection": "3d"}
ax1 = fig.add_subplot(211, **subplot_kws)
ax1.plot([0, 1, 2], [5, 6, 6])
ax2 = fig.add_subplot(212, sharex=ax1, **subplot_kws)
ax2.plot([0, 1, 2], [5, 4, 2])  # x axis is not linked.
```

Split up from #6778.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7047/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
930918574 MDExOlB1bGxSZXF1ZXN0Njc4NTA3NDMw 5540 Cache some properties Illviljan 14371165 open 0     4 2021-06-27T12:22:16Z 2022-07-10T14:34:34Z   MEMBER   1 pydata/xarray/pulls/5540

Cache some small properties that are rather slow to calculate but don't change that often.

Questions that needs to be resolved:

  • [ ] Can these properties change during the lifetime of the class? If so the cache needs to be reset when that happens.
  • [ ] Related to #3514
  • [ ] Tests added
  • [ ] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst

Notes
  • Mixin classes make it difficult to cache properties. For example, ndim in NdimSizeLenMixin cannot easily be replaced with cache_readonly.
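A minimal sketch of the idea, and of the cache-reset concern raised in the first checkbox, using the standard library's functools.cached_property on a hypothetical class (not xarray's cache_readonly):

```python
from functools import cached_property

class Shaped:
    # Hypothetical illustration: cache a slow-to-compute property and
    # reset the cache whenever the underlying state changes.
    def __init__(self, dims):
        self._dims = tuple(dims)

    @cached_property
    def ndim(self):
        return len(self._dims)

    def set_dims(self, dims):
        self._dims = tuple(dims)
        # cached_property stores its value in the instance __dict__;
        # popping it forces recomputation on the next access.
        self.__dict__.pop("ndim", None)
```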

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5540/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
597785475 MDU6SXNzdWU1OTc3ODU0NzU= 3962 Interpolation - Support extrapolation method "clip" Illviljan 14371165 open 0     4 2020-04-10T09:07:13Z 2022-05-02T13:42:24Z   MEMBER      

Hello,

I would like an option in da.interp() that, instead of returning NaNs during extrapolation, returns the data corresponding to the end of the breakpoint data set range.

One way to do this is to limit the new coordinates to the array coordinates' minimum and maximum values; I did a simple example with this solution down below. I think this is a rather safe way, as we are just modifying the inputs to all the various interpolation classes that xarray is using at the moment. But it does look a little weird when printing the extrapolated value: the coordinates show the limited value instead of the requested coordinates. Maybe this can be handled elegantly somewhere in the source code?

MATLAB uses this quite frequently in their interpolation functions:
  • https://mathworks.com/help/simulink/ug/methods-for-estimating-missing-points.html
  • https://mathworks.com/help/simulink/slref/2dlookuptable.html

MCVE Code Sample

```python
import numpy as np
import xarray as xr


def interp(da, coords, extrapolation='clip'):
    """
    Linear interpolation that clips the inputs to the coords min and max value.

    Parameters
    ----------
    da : DataArray
        DataArray to interpolate.
    coords : dict
        Coordinates for the interpolated value.
    """
    if extrapolation == 'clip':
        for k, v in da.coords.items():
            coords[k] = np.maximum(coords[k], np.min(v.values))
            coords[k] = np.minimum(coords[k], np.max(v.values))

    return da.interp(coords)


# Create coordinates:
x = np.linspace(1000, 6000, 4)
y = np.linspace(100, 1200, 3)

# Create data:
X = np.meshgrid(*[x, y], indexing='ij')
data = X[0] * X[1]

# Create DataArray:
da = xr.DataArray(data=data, coords=[('x', x), ('y', y)], name='data')

# Attempt to extrapolate:
datai = interp(da, {'x': 7000, 'y': 375})
```

Expected Output

```
>>> print(datai)
<xarray.DataArray 'data' ()>
array(2250000.)
Coordinates:
    x        float64 6e+03
    y        float64 375.0
```
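For comparison, plain numpy already exposes this "clip" behaviour in one dimension: np.interp holds the endpoint value outside the breakpoint range rather than returning NaN. A small illustration (hypothetical breakpoint data, not from the example above):

```python
import numpy as np

# 1-d illustration of "clip" extrapolation: np.interp clamps to the
# endpoint values outside the breakpoint range by default.
x = np.linspace(1000, 6000, 4)   # breakpoints
f = x * 2.0                      # values at the breakpoints
above = np.interp(7000.0, x, f)  # beyond the right edge
below = np.interp(500.0, x, f)   # beyond the left edge
```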

Versions

Output of `xr.show_versions()`

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.7 (default, Mar 23 2020, 23:19:08) [MSC v.1916 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 58 Stepping 9, GenuineIntel
byteorder: little
LC_ALL: None
LANG: en
LOCALE: None.None
libhdf5: 1.10.4
libnetcdf: None
xarray: 0.15.0
pandas: 1.0.3
numpy: 1.18.1
scipy: 1.4.1
netCDF4: None
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: None
cftime: None
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2.13.0
distributed: 2.13.0
matplotlib: 3.1.3
cartopy: None
seaborn: 0.10.0
numbagg: None
setuptools: 46.1.3.post20200330
pip: 20.0.2
conda: 4.8.3
pytest: 5.4.1
IPython: 7.13.0
sphinx: 2.4.4
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3962/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 issue
963318000 MDExOlB1bGxSZXF1ZXN0NzA1OTU2MDMx 5684 Initialize custom backends in open_dataset Illviljan 14371165 closed 0     4 2021-08-07T23:07:33Z 2021-10-09T23:49:55Z 2021-10-09T23:49:55Z MEMBER   0 pydata/xarray/pulls/5684

The backend classes are initialized in the build_engine function: https://github.com/pydata/xarray/blob/8b95da8e21a9d31de9f79cb0506720595f49e1dd/xarray/backends/plugins.py#L93

This wasn't the case for custom backends: https://github.com/pydata/xarray/blob/8b95da8e21a9d31de9f79cb0506720595f49e1dd/xarray/backends/plugins.py#L161

This PR initializes the engine, fixes the incorrect signature in the test (#5033) and reverts the doc changes done in #5532.

  • [x] Tests added
  • [x] Passes pre-commit run --all-files
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5684/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
925385280 MDExOlB1bGxSZXF1ZXN0NjczODczNjA5 5494 Fix typing in to_stacked_array Illviljan 14371165 closed 0     4 2021-06-19T12:31:26Z 2021-07-02T16:09:07Z 2021-06-24T18:21:03Z MEMBER   0 pydata/xarray/pulls/5494

Attempt to fix these errors found in #5365.

```
xarray/core/computation.py:1533: error: Argument "variable_dim" to "to_stacked_array" of "Dataset" has incompatible type "Hashable"; expected "str"  [arg-type]
xarray/core/computation.py:1533: error: Argument "sample_dims" to "to_stacked_array" of "Dataset" has incompatible type "Mapping[Hashable, int]"; expected "Sequence[Hashable]"  [arg-type]
```

  • [x] Passes pre-commit run --all-files
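A minimal sketch of the second mismatch, with a hypothetical stand-in for the to_stacked_array signature: a Mapping is not a Sequence, but its keys can be materialised into one.

```python
from typing import Hashable, Mapping, Sequence

# Hypothetical stand-in for the signature mypy complains about:
# it wants the dimension *names* as a Sequence, not a sizes Mapping.
def stack_dims(sample_dims: Sequence[Hashable]) -> tuple:
    return tuple(sample_dims)

sizes: Mapping[Hashable, int] = {"x": 2, "y": 3}
# Passing `sizes` directly is the arg-type error; listing its keys
# satisfies Sequence[Hashable].
stacked = stack_dims(list(sizes))
```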

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5494/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
933817286 MDExOlB1bGxSZXF1ZXN0NjgwOTY4OTI3 5555 Faster interpolation using meta Illviljan 14371165 closed 0     4 2021-06-30T15:13:44Z 2021-07-02T16:05:41Z 2021-07-02T12:49:50Z MEMBER   0 pydata/xarray/pulls/5555

Removing a commented-out improvement that required the minimum version requirement for dask to be increased.

  • [x] Requires #5556
  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5555/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
930925407 MDExOlB1bGxSZXF1ZXN0Njc4NTEyNTU5 5541 Faster transpose Illviljan 14371165 closed 0     4 2021-06-27T13:00:21Z 2021-06-27T19:58:14Z 2021-06-27T19:29:58Z MEMBER   0 pydata/xarray/pulls/5541

Make the transpose faster.
  • get_axis_num seems slow; avoid triggering it unnecessarily, for example when using 1d arrays.
  • .copy(deep=False) is the bottleneck for 1d arrays. Not sure if it can be dealt with, though.
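The first point can be sketched as a short-circuit, using a hypothetical helper rather than xarray's actual transpose code: for 0-d and 1-d arrays transposition is the identity, so the axis-number lookup can be skipped entirely.

```python
import numpy as np

def transpose(arr: np.ndarray, axes=None) -> np.ndarray:
    # Hypothetical sketch of the short-circuit described above: for
    # 0-d and 1-d arrays transposing is a no-op, so skip the
    # (relatively slow) axis resolution and return the array as-is.
    if arr.ndim < 2:
        return arr
    return np.transpose(arr, axes)
```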

  • [ ] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5541/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
891995172 MDExOlB1bGxSZXF1ZXN0NjQ0NzczODU4 5311 Remove version checks for older versions than the min deps Illviljan 14371165 closed 0     4 2021-05-14T15:01:17Z 2021-05-18T18:12:50Z 2021-05-15T04:36:15Z MEMBER   0 pydata/xarray/pulls/5311

Found some checks that should always be true nowadays.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5311/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
Powered by Datasette · Queries took 1928.826ms · About: xarray-datasette