issues


19 rows where comments = 4 and user = 10194086 sorted by updated_at descending


type 2

  • pull 10
  • issue 9

state 2

  • closed 18
  • open 1

repo 1

  • xarray 19
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association active_lock_reason draft pull_request body reactions performed_via_github_app state_reason repo type
2163675672 PR_kwDOAMm_X85obI_8 8803 missing chunkmanager: update error message mathause 10194086 open 0     4 2024-03-01T15:48:00Z 2024-03-15T11:02:45Z   MEMBER   0 pydata/xarray/pulls/8803

When dask is missing we get the following error message:

```python-traceback
ValueError: unrecognized chunk manager dask - must be one of: []
```

This could be confusing - the error message seems geared towards a typo in the requested manager. However, I think it's much more likely that a chunk manager is simply not installed. I tried to update the error message - happy to get feedback.
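
For illustration, a minimal sketch of the kind of branch this suggests (names and wording are made up, not the actual xarray code):

```python
def _resolve_chunkmanager(manager: str, chunkmanagers: dict):
    # illustrative sketch: distinguish "nothing installed" from "unknown name"
    if manager not in chunkmanagers:
        if not chunkmanagers:
            raise ImportError(
                f"chunk manager {manager!r} is not available. "
                "Please make sure the relevant package (e.g. dask) is installed."
            )
        raise ValueError(
            f"unrecognized chunk manager {manager!r} - must be one of: "
            f"{list(chunkmanagers)}"
        )
    return chunkmanagers[manager]
```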

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8803/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2090265314 PR_kwDOAMm_X85kiCi8 8627 unify freq strings (independent of pd version) mathause 10194086 closed 0     4 2024-01-19T10:57:04Z 2024-02-15T17:53:42Z 2024-02-15T16:53:36Z MEMBER   0 pydata/xarray/pulls/8627
  • [ ] Addresses points 2 and 3 and closes #8612
  • [ ] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst

Probably not ready for review yet.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8627/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
2025652693 PR_kwDOAMm_X85hJh0D 8521 test and fix empty xindexes repr mathause 10194086 closed 0     4 2023-12-05T08:54:56Z 2024-01-08T10:58:09Z 2023-12-06T17:06:15Z MEMBER   0 pydata/xarray/pulls/8521
  • [x] Closes #8367
  • [x] Tests added
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst

Uses `max` with a `default`, which works with empty iterators, in contrast to `if col_items else 0`.
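
For reference, a quick illustration of the difference (`col_items` is hypothetical here, not the actual repr code):

```python
col_items = []  # hypothetical: no index entries

# `max` on an empty iterator raises ValueError, but with `default` it
# falls back gracefully; no separate `if col_items else 0` branch needed.
max_width = max((len(s) for s in col_items), default=0)  # -> 0
```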

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/8521/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
144630996 MDU6SXNzdWUxNDQ2MzA5OTY= 810 correct DJF mean mathause 10194086 closed 0     4 2016-03-30T15:36:42Z 2022-04-06T16:19:47Z 2016-05-04T12:56:30Z MEMBER      

This started as a question and I'm adding it here as a reference. Maybe you have a comment.

There are several ways to calculate time series of seasonal data (starting from monthly or daily data):

```python
# load libraries
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

# Create Example Dataset
time = pd.date_range('2000.01.01', '2010.12.31', freq='M')
data = np.random.rand(*time.shape)
ds = xr.DataArray(data, coords=dict(time=time))

# (1) using resample
ds_res = ds.resample('Q-FEB', 'time')
ds_res = ds_res.sel(time=ds_res['time.month'] == 2)
ds_res = ds_res.groupby('time.year').mean('time')

# (2) this is wrong
ds_season = ds.where(ds['time.season'] == 'DJF').groupby('time.year').mean('time')

# (3) using where and rolling
# mask other months with nan
ds_DJF = ds.where(ds['time.season'] == 'DJF')
# rolling mean -> only Jan is not nan
# however, we lose Jan/Feb in the first year and Dec in the last
ds_DJF = ds_DJF.rolling(min_periods=3, center=True, time=3).mean()
# make annual mean
ds_DJF = ds_DJF.groupby('time.year').mean('time')

ds_res.plot(marker='*')
ds_season.plot()
ds_DJF.plot()

plt.show()
```

(1) The first is to use resample with 'Q-FEB' as argument. This works fine. It does include Jan/Feb in the first year, and Dec in the last year + 1. Whether this makes sense can be debated. One case where this does not work is when you have, say, two regions in your data set: for one you want to calculate DJF and for the other NovDecJan.

(2) Using 'time.season' is wrong as it combines Jan, Feb and Dec from the same year.

(3) The third uses where and rolling, and you lose 'incomplete' seasons. If you replace ds.where(ds['time.season'] == 'DJF') with ds.groupby('time.month').where(summer_months), where summer_months is a boolean array, it also works for non-standard 'summers' (or seasons) across the globe.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/810/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
1126086052 PR_kwDOAMm_X84yLQ48 6251 use `warnings.catch_warnings(record=True)` instead of `pytest.warns(None)` mathause 10194086 closed 0     4 2022-02-07T14:42:26Z 2022-02-18T16:51:58Z 2022-02-18T16:51:55Z MEMBER   0 pydata/xarray/pulls/6251

pytest v7.0.0 no longer wants us to use pytest.warns(None) to test for no warning, so we can use warnings.catch_warnings(record=True) instead.
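
A minimal sketch of the replacement pattern (`do_something` is a hypothetical stand-in for the code under test):

```python
import warnings


def do_something():
    pass  # hypothetical code under test that should emit no warning


def test_no_warning():
    # old pattern, rejected by pytest >= 7:
    #     with pytest.warns(None):
    #         do_something()
    # replacement: record all warnings and assert none were emitted
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        do_something()
    assert len(record) == 0
```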

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6251/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
752870062 MDExOlB1bGxSZXF1ZXN0NTI5MDc4NDA0 4616 don't type check __getattr__ mathause 10194086 closed 0     4 2020-11-29T08:53:09Z 2022-01-26T08:41:18Z 2021-10-18T14:06:30Z MEMBER   1 pydata/xarray/pulls/4616
  • [x] Closes #4601
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

It's not pretty as I had to define a number of empty methods... I think this should wait for 0.17

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4616/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
802400938 MDExOlB1bGxSZXF1ZXN0NTY4NTUwNDEx 4865 fix da.pad example for numpy 1.20 mathause 10194086 closed 0     4 2021-02-05T19:00:04Z 2021-10-18T14:06:33Z 2021-02-07T21:57:34Z MEMBER   0 pydata/xarray/pulls/4865
  • [x] Closes #4858
  • [x] Passes pre-commit run --all-files
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4865/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
800118528 MDU6SXNzdWU4MDAxMTg1Mjg= 4858 doctest failure with numpy 1.20 mathause 10194086 closed 0     4 2021-02-03T08:57:43Z 2021-02-07T21:57:34Z 2021-02-07T21:57:34Z MEMBER      

What happened:

Our doctests fail since numpy 1.20 came out:

https://github.com/pydata/xarray/pull/4760/checks?check_run_id=1818512841#step:8:69

What you expected to happen:

They don't ;-)

Minimal Complete Verifiable Example:

The following fails with numpy 1.20, whereas it previously converted np.NaN to an integer (xarray.DataArray.pad at the bottom)

```python
import numpy as np

x = np.arange(10)
x = np.pad(x, 1, "constant", constant_values=np.nan)
```

requires numpy 1.20

Anything else we need to know?:

  • that's probably related to https://numpy.org/doc/stable/release/1.20.0-notes.html#numpy-scalars-are-cast-when-assigned-to-arrays
  • I asked if this behavior will stay: https://github.com/numpy/numpy/issues/16499#issuecomment-772342087
  • One possibility is to add a check np.can_cast(constant_values.dtype, array.dtype) (or similar) for a better error message - see the sketch below.
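
A minimal sketch of such a check (names are illustrative, not xarray's actual pad code):

```python
import numpy as np


def _check_pad_fill_value(array, constant_values):
    # illustrative only: raise a clearer error before numpy attempts the cast
    fill = np.asarray(constant_values)
    if not np.can_cast(fill.dtype, array.dtype, casting="same_kind"):
        raise ValueError(
            f"cannot pad array of dtype {array.dtype} with constant_values "
            f"of dtype {fill.dtype}"
        )
```
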
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4858/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
688115687 MDU6SXNzdWU2ODgxMTU2ODc= 4385 warnings from internal use of apply_ufunc mathause 10194086 closed 0     4 2020-08-28T14:28:56Z 2020-08-30T16:37:52Z 2020-08-30T16:37:52Z MEMBER      

Another follow-up from #4060: quantile now emits a FutureWarning:

Minimal Complete Verifiable Example:

```python
xr.DataArray([1, 2, 3]).quantile(q=0.5)
```

```
~/.conda/envs/ipcc_ar6/lib/python3.7/site-packages/xarray/core/variable.py:1866: FutureWarning: ``output_sizes`` should be given in the ``dask_gufunc_kwargs`` parameter. It will be removed as direct parameter in a future version.
  kwargs={"q": q, "axis": axis, "interpolation": interpolation},
```

We should probably check the warnings in the test suite - there may be others.
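
For context, the warning points to moving output_sizes into dask_gufunc_kwargs when calling apply_ufunc. A rough sketch of that pattern (a toy example that needs dask installed, not the actual variable.py quantile code):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(6).reshape(2, 3), dims=("x", "y")).chunk({"x": 1})


def duplicate_last(arr):
    # toy function that adds a new trailing dimension of size 2
    return np.stack([arr, arr], axis=-1)


result = xr.apply_ufunc(
    duplicate_last,
    da,
    output_core_dims=[["copy"]],
    dask="parallelized",
    output_dtypes=[da.dtype],
    # `output_sizes` used to be a direct keyword; it now belongs here:
    dask_gufunc_kwargs={"output_sizes": {"copy": 2}},
)
```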

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4385/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
683856183 MDExOlB1bGxSZXF1ZXN0NDcxODc4NjUz 4365 Silence plot warnings mathause 10194086 closed 0     4 2020-08-21T22:21:40Z 2020-08-24T16:05:13Z 2020-08-24T16:00:42Z MEMBER   0 pydata/xarray/pulls/4365
  • [x] Towards #3266
  • [x] Tests added
  • [x] Passes isort . && black . && mypy . && flake8
  • [x] User visible changes (including notable bug fixes) are documented in whats-new.rst

I gave silencing some of the warnings for the plotting functions a try and brought the count down from 67 to 5 (4 of which come from external libraries).

  • [x] MatplotlibDeprecationWarning: The 'extend' parameter to Colorbar has no effect because it is overridden by the mappable; it is deprecated since 3.3 and will be removed two minor releases later.
  • [x] MatplotlibDeprecationWarning: You are modifying the state of a globally registered colormap. In future versions, you will not be able to modify a registered colormap in-place. To remove this warning, you can make a copy of the colormap first. cmap = copy.copy(mpl.cm.get_cmap("viridis"))
  • [x] MatplotlibDeprecationWarning: Passing parameters norm and vmin/vmax simultaneously is deprecated since 3.3 and will become an error two minor releases later. Please pass vmin/vmax directly to the norm when creating it.
  • [ ] MatplotlibDeprecationWarning: shading='flat' when X and Y have the same dimensions as C is deprecated since 3.3. Either specify the corners of the quadrilaterals with X and Y, or pass shading='auto', 'nearest' or 'gouraud', or set rcParams['pcolor.shading']. This will become an error two minor releases later. See #4364
  • [x] UserWarning: Requested projection is different from current axis projection, creating new axis with requested projection.
  • [x] Made sure all figures are closed at the end of a test.
  • [x] Added a meta-test to ensure all figures will be closed if tests are added in the future

I cannot rule out that one of these changes affects the plots in a way that is not covered by the test suite... Tests pass locally for py38 and py36-bare-minimum.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4365/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
514308007 MDExOlB1bGxSZXF1ZXN0MzMzOTQ4MDg2 3463 unpin cftime mathause 10194086 closed 0     4 2019-10-30T00:05:55Z 2020-08-19T13:11:55Z 2019-10-30T01:08:14Z MEMBER   0 pydata/xarray/pulls/3463

I think the cftime problems should be fixed after the release of v1.0.4.2.

#3434

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3463/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
620424728 MDExOlB1bGxSZXF1ZXN0NDE5Njc1NDU1 4075 Fix bool weights mathause 10194086 closed 0     4 2020-05-18T18:42:05Z 2020-08-19T13:11:40Z 2020-05-23T21:06:19Z MEMBER   0 pydata/xarray/pulls/4075
  • [x] Closes #4074
  • [x] Tests added
  • [x] Passes isort -rc . && black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4075/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
559864146 MDU6SXNzdWU1NTk4NjQxNDY= 3750 isort pre-commit hook does not skip text files mathause 10194086 closed 0     4 2020-02-04T17:18:31Z 2020-05-06T01:50:29Z 2020-03-28T20:58:15Z MEMBER      

MCVE Code Sample

Add an arbitrary change to the file doc/pandas.rst

```bash
git add doc/pandas.rst
git commit -m "test"
```

The pre-commit hook will fail.

Expected Output

The pre-commit hook passes.

Problem Description

Running isort -rc doc/* will change the following files:

```bash
modified:   contributing.rst
modified:   howdoi.rst
modified:   internals.rst
modified:   io.rst
modified:   pandas.rst
modified:   quick-overview.rst
```

Unfortunately it does not behave properly and deletes/changes arbitrary lines. Can the pre-commit hook be told to only run on *.py files? On the command line this would be isort -rc *.py.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3750/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
550964139 MDExOlB1bGxSZXF1ZXN0MzYzNzcyNzE3 3699 Feature/align in dot mathause 10194086 closed 0     4 2020-01-16T17:55:38Z 2020-01-20T12:55:51Z 2020-01-20T12:09:27Z MEMBER   0 pydata/xarray/pulls/3699
  • [x] Closes #3694
  • [x] Tests added
  • [x] Passes black . && mypy . && flake8
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API

Happy to get feedback @fujiisoup @shoyer

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3699/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
    xarray 13221727 pull
545764524 MDU6SXNzdWU1NDU3NjQ1MjQ= 3665 Cannot roundtrip time in NETCDF4_CLASSIC mathause 10194086 closed 0     4 2020-01-06T14:47:48Z 2020-01-16T18:27:15Z 2020-01-16T18:27:14Z MEMBER      

MCVE Code Sample

```python
import numpy as np
import xarray as xr

time = xr.cftime_range("2006-01-01", periods=2, calendar="360_day")

da = xr.DataArray(time, dims=["time"])
da.encoding["dtype"] = np.float
da.to_netcdf("tst.nc", format="NETCDF4_CLASSIC")

ds = xr.open_dataset("tst.nc")
ds.to_netcdf("tst2.nc", format="NETCDF4_CLASSIC")
```

yields:

```
ValueError: could not safely cast array from dtype int64 to int32
```

Or an example without to_netcdf:

```python
import numpy as np
import xarray as xr

time = xr.cftime_range("2006-01-01", periods=2, calendar="360_day")

da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])

xr.backends.netcdf3.encode_nc3_variable(xr.conventions.encode_cf_variable(da))
```

Expected Output

Xarray can save the dataset / an xr.Variable.

Problem Description

If there is a time variable that can be encoded using integers only, but that has a _FillValue set to NaN, saving with to_netcdf(name, format="NETCDF4_CLASSIC") fails. The problem is that xarray adds an (unnecessary) _FillValue when saving the file.
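
A possible workaround (an untested assumption on my side, not something confirmed in this issue) would be to disable the fill value explicitly when writing, continuing the MCVE above:

```python
# untested assumption: explicitly disable _FillValue for the time variable
ds.to_netcdf(
    "tst2.nc",
    format="NETCDF4_CLASSIC",
    encoding={"time": {"_FillValue": None}},
)
```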

Note: if the time cannot be encoded using integers only, it works:

```python
da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])
da.encoding["units"] = "days since 2006-01-01T12:00:00"

xr.backends.netcdf3.encode_nc3_variable(xr.conventions.encode_cf_variable(da))
```

Another note: when saving with NETCDF4

```python
da = xr.DataArray(time, dims=["time"])
da.encoding["_FillValue"] = np.array([np.nan])

xr.backends.netCDF4_._encode_nc4_variable(xr.conventions.encode_cf_variable(da))
```

the following is returned:

```
<xarray.Variable (time: 2)>
array([0, 1])
Attributes:
    units:       days since 2006-01-01 00:00:00.000000
    calendar:    proleptic_gregorian
    _FillValue:  [-9223372036854775808]
```

Output of xr.show_versions()

INSTALLED VERSIONS ------------------ commit: None python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0] python-bits: 64 OS: Linux OS-release: 4.12.14-lp151.28.36-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_US.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.1 xarray: 0.14.1 pandas: 0.25.2 numpy: 1.17.3 scipy: 1.3.1 netCDF4: 1.5.3 pydap: None h5netcdf: 0.7.4 h5py: 2.10.0 Nio: None zarr: None cftime: 1.0.4.2 nc_time_axis: 1.2.0 PseudoNetCDF: None rasterio: 1.1.1 cfgrib: None iris: None bottleneck: 1.3.1 dask: 2.6.0 distributed: 2.6.0 matplotlib: 3.1.2 cartopy: 0.17.0 seaborn: 0.9.0 numbagg: None setuptools: 41.4.0 pip: 19.3.1 conda: None pytest: 5.2.2 IPython: 7.9.0 sphinx: 2.2.1
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3665/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
106595746 MDU6SXNzdWUxMDY1OTU3NDY= 577 wrap lon coordinates to 360 mathause 10194086 closed 0     4 2015-09-15T16:36:37Z 2019-01-17T09:34:56Z 2019-01-15T20:15:01Z MEMBER      

Assume I have two datasets with the same lat/lon grid. However, one has lon = 0...359 and the other lon = -180...179. How can I wrap one of them around?
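
One common idiom (a sketch of one possible approach, not necessarily the answer given in this issue) is to shift the coordinate with modulo arithmetic and re-sort:

```python
import numpy as np
import xarray as xr

# toy dataset with lon = -180...179 (illustrative only)
ds = xr.Dataset(coords={"lon": np.arange(-180.0, 180.0)})

# shift longitudes to 0...359 and re-sort so the two grids line up
ds_wrapped = ds.assign_coords(lon=ds.lon % 360).sortby("lon")
```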

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/577/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
310819233 MDU6SXNzdWUzMTA4MTkyMzM= 2036 better error message for to_netcdf -> unlimited_dims mathause 10194086 closed 0     4 2018-04-03T12:39:21Z 2018-05-18T14:48:32Z 2018-05-18T14:48:32Z MEMBER      

Code Sample, a copy-pastable example if possible

```python
# Your code here
import numpy as np
import xarray as xr

x = np.arange(10)
da = xr.Dataset(
    data_vars=dict(data=('dim1', x)),
    coords=dict(dim1=('dim1', x), dim2=('dim2', x)),
)
da.to_netcdf('tst.nc', format='NETCDF4_CLASSIC', unlimited_dims='dim1')
```

Problem description

This creates the error RuntimeError: NetCDF: NC_UNLIMITED size already in use. With format='NETCDF4' it silently creates the dimensions d, i, m, and 1.

The correct syntax is unlimited_dims=['dim1'].

With format='NETCDF4_CLASSIC' and unlimited_dims=['dim1', 'dim2'], it still raises the not-so-helpful NC_UNLIMITED error.

I only tested with netCDF4 as backend.

Expected Output

  • better error message
  • work with unlimited_dims='dim1'

Output of xr.show_versions()

INSTALLED VERSIONS ------------------ commit: None python: 3.6.5.final.0 python-bits: 64 OS: Linux OS-release: 4.4.120-45-default machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 xarray: 0.10.2 pandas: 0.22.0 numpy: 1.14.2 scipy: 1.0.1 netCDF4: 1.3.1 h5netcdf: 0.5.0 h5py: 2.7.1 Nio: None zarr: None bottleneck: 1.2.1 cyordereddict: 1.0.0 dask: 0.17.2 distributed: 1.21.5 matplotlib: 2.2.2 cartopy: 0.16.0 seaborn: 0.8.1 setuptools: 39.0.1 pip: 9.0.3 conda: None pytest: 3.5.0 IPython: 6.3.0 sphinx: 1.7.2
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2036/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
106581329 MDU6SXNzdWUxMDY1ODEzMjk= 576 define fill value for where mathause 10194086 closed 0     4 2015-09-15T15:27:32Z 2017-08-08T17:00:30Z 2017-08-08T17:00:30Z MEMBER      

It would be nice if where accepted an `other` argument:

```python
def where(self, cond, other=np.NaN):
    pass
```
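
Usage would then look roughly like this (hypothetical, since this is a feature request):

```python
import xarray as xr

da = xr.DataArray([-1.0, 0.0, 1.0, 2.0])

# keep values > 0 and fill the rest with 0 instead of NaN
filled = da.where(da > 0, other=0)
```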

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/576/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
67332234 MDU6SXNzdWU2NzMzMjIzNA== 386 "loosing" virtual variables mathause 10194086 closed 0     4 2015-04-09T10:35:31Z 2015-04-20T03:55:44Z 2015-04-20T03:55:44Z MEMBER      

Once I take a mean over virtual variables, they are not available any more.

```python
import pandas as pd
import numpy as np
import xray

t = pd.date_range('2000-01-01', '2000-12-31', freq='6H')
x = np.random.rand(*t.shape)
time = xray.DataArray(t, name='t', dims='time')
ts = xray.Dataset({'x': ('time', x), 'time': time})
ts_mean = ts.groupby('time.date').mean()
ts_mean.virtual_variables
```

Is this intended behaviour? And could I get them back somehow?

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/386/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);