issues
13 rows where state = "closed", type = "issue" and user = 3958036 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | active_lock_reason | draft | pull_request | body | reactions | performed_via_github_app | state_reason | repo | type |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
666523009 | MDU6SXNzdWU2NjY1MjMwMDk= | 4276 | isel with 0d dask array index fails | johnomotani 3958036 | closed | 0 | 0 | 2020-07-27T19:14:22Z | 2023-03-15T02:48:01Z | 2023-03-15T02:48:01Z | CONTRIBUTOR | What happened:
If a 0d dask array is passed as an argument to `isel()`, an error is raised. What you expected to happen: the indexing should succeed, as it does with a numpy-backed index.
Minimal Complete Verifiable Example:
```python
import dask.array as daskarray
import numpy as np
import xarray as xr

a = daskarray.from_array(np.linspace(0., 1.))
da = xr.DataArray(a, dims="x")
x_selector = da.argmax(dim=...)
da_max = da.isel(x_selector)
```
Anything else we need to know?: I think the problem is here
https://github.com/pydata/xarray/blob/a198218ddabe557adbb04311b3234ec8d20419e7/xarray/core/variable.py#L546-L548
May be related to #2511, but from the code snippet above, I think this is a specific issue with 0d dask arrays rather than a generic dask-indexing issue like #2511. I'd like to fix this because it breaks the nice new `argmax(dim=...)` features. Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-42-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.0.5 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.19.0 distributed: 2.21.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.2.0.post20200712 pip: 20.1.1 conda: 4.8.3 pytest: 5.4.3 IPython: 7.15.0 sphinx: None |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4276/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
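The report above points at the index-unwrapping code in `variable.py`. A numpy-only sketch of the kind of unwrapping needed there: `normalize_0d_index` is a hypothetical helper (not xarray's code) showing how a 0d array-like index, such as the result of `argmax`, can be materialised into a plain integer before indexing. It assumes the 0d value is integer-like.

```python
import numpy as np

def normalize_0d_index(k):
    """Convert a 0d array-like index to a plain int.

    Hypothetical helper: 0d duck arrays (e.g. a 0d dask array returned
    by argmax) must be materialised before use as an integer index.
    For a dask array, np.asarray(...).item() would trigger the compute.
    """
    arr = np.asarray(k)
    if arr.ndim == 0:
        return int(arr.item())  # assumes an integer-like 0d value
    return k

a = np.linspace(0.0, 1.0)
idx = normalize_0d_index(np.argmax(a))  # np.argmax returns a 0d integer
assert a[idx] == 1.0                    # the maximum of linspace(0, 1)
assert normalize_0d_index(np.array(7)) == 7
assert normalize_0d_index([0, 1]) == [0, 1]  # non-0d indexers pass through
```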
698577111 | MDU6SXNzdWU2OTg1NzcxMTE= | 4417 | Inconsistency in whether index is created with new dimension coordinate? | johnomotani 3958036 | closed | 0 | 6 | 2020-09-10T22:44:54Z | 2022-09-13T07:54:32Z | 2022-09-13T07:54:32Z | CONTRIBUTOR | It seems inconsistent whether an index is created for a new dimension coordinate, depending on how the coordinate is added.
(1)
```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds['a'] = ('x', np.linspace(0,1))
ds['b'] = ('x', np.linspace(3,4))
ds = ds.rename(b='x')
ds = ds.set_coords('x')
print(ds)
print('indexes', ds.indexes)
```
(2)
```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds['a'] = ('x', np.linspace(0,1))
ds['x'] = ('x', np.linspace(3,4))
print(ds)
print('indexes', ds.indexes)
```
Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-47-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.1.1 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.23.0 distributed: 2.25.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.6.0.post20200814 pip: 20.2.3 conda: 4.8.4 pytest: 5.4.3 IPython: 7.15.0 sphinx: 3.2.1 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4417/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
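For readers unfamiliar with the convention the report relies on, here is a pure-Python sketch (a hypothetical helper, not xarray's code) of the "dimension coordinate" rule that determines when an index is created:

```python
def creates_index(name, dims):
    """Return True if a variable named `name` with dimensions `dims`
    would be a "dimension coordinate" and hence get an index.

    Sketch of xarray's convention: a 1d variable whose name equals its
    single dimension is a dimension coordinate.
    """
    return len(dims) == 1 and dims[0] == name

# Case (2) in the report: ds['x'] = ('x', values) -> dimension coordinate.
assert creates_index("x", ("x",))
# 'b' on dim 'x' is a plain data variable; the report asks whether
# rename(b='x') + set_coords('x') should end up exactly like case (2).
assert not creates_index("b", ("x",))
```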
708337538 | MDU6SXNzdWU3MDgzMzc1Mzg= | 4456 | workaround for file with variable and dimension having same name | johnomotani 3958036 | closed | 0 | 4 | 2020-09-24T17:10:04Z | 2021-12-29T16:55:53Z | 2021-12-29T16:55:53Z | CONTRIBUTOR | Adding a variable that's not a 1d "dimension coordinate" with the same name as a dimension is an error. This makes sense. However, if I have a file that already contains such a variable, I cannot open it:
```python
import netCDF4
import xarray as xr

f = netCDF4.Dataset("test.nc", "w")
f.createDimension("x", 2)
f.createDimension("y", 3)
f.createVariable("y", float, ("x", "y"))
f["y"][...] = 1.0
f.close()

ds = xr.open_dataset('test.nc')  # raises: "y" is 2d but shares a dimension's name
```
I think it might be nice to have something like a |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4456/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
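One practical escape hatch for such files is `xr.open_dataset(path, drop_variables=["y"])`, which skips the conflicting variable entirely. A pure-Python sketch of that filtering step (`load_variables` and its `name -> (dims, data)` mapping are illustrative assumptions, not xarray's internals):

```python
def load_variables(raw_variables, drop_variables=()):
    """Filter out named variables before a dataset is constructed,
    mimicking the effect of open_dataset's drop_variables argument.

    raw_variables: mapping of name -> (dims, data) as read from a file
    (hypothetical structure for illustration).
    """
    drop = set(drop_variables)
    return {name: var for name, var in raw_variables.items() if name not in drop}

raw = {
    "x": (("x",), [0, 1]),
    # 2d variable sharing its name with dimension "y" -- the case that
    # makes opening the file fail.
    "y": (("x", "y"), [[1, 1, 1], [1, 1, 1]]),
}
clean = load_variables(raw, drop_variables=["y"])
assert "y" not in clean and "x" in clean
```

Dropping the variable loses its data, which is why the report asks for a renaming-style workaround instead.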
842583817 | MDU6SXNzdWU4NDI1ODM4MTc= | 5084 | plot_surface() wrapper | johnomotani 3958036 | closed | 0 | 2 | 2021-03-27T19:16:09Z | 2021-05-03T13:05:02Z | 2021-05-03T13:05:02Z | CONTRIBUTOR | Is there an xarray way to make a surface plot, like matplotlib's `plot_surface()`? |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5084/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
823059252 | MDU6SXNzdWU4MjMwNTkyNTI= | 5002 | Dataset.plot.quiver() docs slightly misleading | johnomotani 3958036 | closed | 0 | 3 | 2021-03-05T12:54:19Z | 2021-05-01T17:38:39Z | 2021-05-01T17:38:39Z | CONTRIBUTOR | In the docs for `Dataset.plot.quiver()`, part of the description is slightly misleading. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5002/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
847006334 | MDU6SXNzdWU4NDcwMDYzMzQ= | 5097 | 2d plots may fail for some choices of `x` and `y` | johnomotani 3958036 | closed | 0 | 1 | 2021-03-31T17:26:34Z | 2021-04-22T07:16:17Z | 2021-04-22T07:16:17Z | CONTRIBUTOR | What happened:
When making a 2d plot with a 1d coordinate passed as `x` and a 2d coordinate passed as `y` (or vice versa), the plot can come out wrong. What you expected to happen: All three plots in the MCVE should be identical. Minimal Complete Verifiable Example:
```python
from matplotlib import pyplot as plt
import numpy as np
import xarray as xr

ds = xr.Dataset({"z": (["x", "y"], np.random.rand(4,4))})
x2d, y2d = np.meshgrid(ds["x"], ds["y"])
ds = ds.assign_coords(x2d=(["x", "y"], x2d.T), y2d=(["x", "y"], y2d.T))

fig, axes = plt.subplots(1,3)
h0 = ds["z"].plot.pcolormesh(x="y2d", y="x2d", ax=axes[0])
h1 = ds["z"].plot.pcolormesh(x="y", y="x", ax=axes[1])
h2 = ds["z"].plot.pcolormesh(x="y", y="x2d", ax=axes[2])
plt.show()
```
result:
Anything else we need to know?: The bug is present in both the 0.17.0 release and the current development branch. I came across this while starting to work on #5084. I think the problem is here
https://github.com/pydata/xarray/blob/ddc352faa6de91f266a1749773d08ae8d6f09683/xarray/plot/plot.py#L678-L684
as the check there does not catch this combination of 1d and 2d `x` and `y`. Why don't we just do something like broadcasting the chosen coordinates against the data array instead?
Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.8.6 | packaged by conda-forge | (default, Oct 7 2020, 19:08:05) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-70-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.6 libnetcdf: 4.7.4 xarray: 0.17.0 pandas: 1.1.5 numpy: 1.19.4 scipy: 1.5.3 netCDF4: 1.5.5.1 pydap: None h5netcdf: None h5py: 3.1.0 Nio: None zarr: None cftime: 1.3.0 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2020.12.0 distributed: 2020.12.0 matplotlib: 3.3.3 cartopy: None seaborn: None numbagg: None pint: 0.16.1 setuptools: 49.6.0.post20201009 pip: 20.3.3 conda: 4.9.2 pytest: 6.2.1 IPython: 7.19.0 sphinx: 3.4.0 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/5097/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
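The "just broadcast" idea at the end of the report can be sketched with plain numpy. `coords_to_2d` is a hypothetical helper, not xarray's plotting code; it assumes `x` varies along the data's second axis and `y` along its first (matplotlib's `pcolormesh` convention), so that any mix of 1d and 2d coordinate choices ends up in the same 2d form:

```python
import numpy as np

def coords_to_2d(z, x, y):
    """Expand 1d plot coordinates to 2d arrays matching z's shape.

    Hypothetical helper: instead of comparing the user's x and y
    choices against each other, broadcast whichever is 1d against z,
    so mixed 1d/2d combinations are handled uniformly.
    """
    z, x, y = np.asarray(z), np.asarray(x), np.asarray(y)
    if x.ndim == 1:
        x = np.broadcast_to(x, z.shape)           # repeat the x vector down the rows
    if y.ndim == 1:
        y = np.broadcast_to(y[:, None], z.shape)  # repeat the y vector across the columns
    return x, y

z = np.zeros((3, 4))
x2d, y2d = coords_to_2d(z, np.arange(4), np.arange(3))
assert x2d.shape == y2d.shape == z.shape
```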
698263021 | MDU6SXNzdWU2OTgyNjMwMjE= | 4415 | Adding new DataArray to a Dataset removes attrs of existing coord | johnomotani 3958036 | closed | 0 | 3 | 2020-09-10T17:21:32Z | 2020-09-10T17:38:08Z | 2020-09-10T17:38:08Z | CONTRIBUTOR | Minimal Complete Verifiable Example:
```python
import numpy as np
import xarray as xr

ds = xr.Dataset()
ds["a"] = xr.DataArray(np.linspace(0., 1.), dims="x")
ds["x"] = xr.DataArray(np.linspace(0., 2., len(ds["x"])), dims="x")
ds["x"].attrs["foo"] = "bar"
print(ds["x"])
ds["b"] = xr.DataArray(np.linspace(0., 1.), dims="x")
print(ds["x"])
```
What happened:
Attribute `foo` is dropped from the existing coordinate `x`. Full output:
``` <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 Attributes: foo: bar <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 ```
What you expected to happen:
Coordinate `x` should keep its attributes. Full expected output:
``` <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 1.878 1.918 1.959 2.0 Attributes: foo: bar <xarray.DataArray 'x' (x: 50)> array([0. , 0.040816, 0.081633, 0.122449, 0.163265, 0.204082, 0.244898, 0.285714, 0.326531, 0.367347, 0.408163, 0.44898 , 0.489796, 0.530612, 0.571429, 0.612245, 0.653061, 0.693878, 0.734694, 0.77551 , 0.816327, 0.857143, 0.897959, 0.938776, 0.979592, 1.020408, 1.061224, 1.102041, 1.142857, 1.183673, 1.22449 , 1.265306, 1.306122, 1.346939, 1.387755, 1.428571, 1.469388, 1.510204, 1.55102 , 1.591837, 1.632653, 1.673469, 1.714286, 1.755102, 1.795918, 1.836735, 1.877551, 1.918367, 1.959184, 2. ]) Coordinates: * x (x) float64 0.0 0.04082 0.08163 0.1224 ... 
1.878 1.918 1.959 2.0 Attributes: foo: bar ```Anything else we need to know?: Environment: Output of <tt>xr.show_versions()</tt>INSTALLED VERSIONS ------------------ commit: None python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50) [GCC 7.5.0] python-bits: 64 OS: Linux OS-release: 5.4.0-47-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.5 libnetcdf: 4.7.4 xarray: 0.16.0 pandas: 1.1.1 numpy: 1.18.5 scipy: 1.4.1 netCDF4: 1.5.3 pydap: None h5netcdf: None h5py: 2.10.0 Nio: None zarr: None cftime: 1.2.1 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.23.0 distributed: 2.25.0 matplotlib: 3.2.2 cartopy: None seaborn: None numbagg: None pint: 0.13 setuptools: 49.6.0.post20200814 pip: 20.2.3 conda: 4.8.4 pytest: 5.4.3 IPython: 7.15.0 sphinx: 3.2.1 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
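The behaviour the report expects can be pictured with a small pure-Python sketch. `assign_preserving_attrs` and its `name -> {"values", "attrs"}` store are hypothetical stand-ins for a Dataset, not xarray's internals; the point is that attributes already attached to an existing coordinate survive a re-assignment:

```python
def assign_preserving_attrs(store, name, values):
    """(Re)assign a variable into a dataset-like mapping, keeping any
    attributes already attached under the same name.

    Hypothetical structure: store maps name -> {"values": ..., "attrs": {...}}.
    """
    attrs = dict(store.get(name, {}).get("attrs", {}))
    store[name] = {"values": values, "attrs": attrs}

ds = {}
assign_preserving_attrs(ds, "x", [0.0, 1.0, 2.0])
ds["x"]["attrs"]["foo"] = "bar"
assign_preserving_attrs(ds, "x", [0.0, 0.5, 1.0])  # re-assignment keeps attrs
assert ds["x"]["attrs"] == {"foo": "bar"}
```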
668515620 | MDU6SXNzdWU2Njg1MTU2MjA= | 4289 | title bar of docs displays incorrect version | johnomotani 3958036 | closed | 0 | 5 | 2020-07-30T09:00:43Z | 2020-08-18T22:32:51Z | 2020-08-18T22:32:51Z | CONTRIBUTOR | What happened:
The browser title bar displays an incorrect version when viewing the docs online. See below - title bar says 0.15.1 but actual version in URL is 0.16.0.
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/4289/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
595882590 | MDU6SXNzdWU1OTU4ODI1OTA= | 3948 | Releasing memory? | johnomotani 3958036 | closed | 0 | 6 | 2020-04-07T13:49:07Z | 2020-04-07T14:18:36Z | 2020-04-07T14:18:36Z | CONTRIBUTOR | Once data has been loaded into memory, is there a way to release it again without closing the whole Dataset? For example, what would be the best workflow for this case: I have several large arrays on disk. Each will fit into memory individually. I want to do some analysis on each array (which produces small results), and keep the results in memory, but I do not need the large arrays any more after the analysis. I'm wondering if some sort of `release()` method would help:
```python
da2 = ds["variable2"]
result2 = do_some_work(da2)  # may load large parts of da2 into memory
da2.release()  # any changes to da2 not already saved to disk are lost, but do not want da1 any more
# ... etc.
```
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3948/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
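At the Python level, the hypothetical `da.release()` in the question boils down to dropping the last reference to the loaded data so the interpreter can reclaim it; the closest stock tools are `del`/reassignment (or dask-backed datasets that never hold the full array). A small stdlib-only illustration of that reclamation, with a `BigArray` stand-in for a large loaded array:

```python
import gc
import weakref

class BigArray:
    """Stand-in for a large array loaded into memory."""
    def __init__(self, n):
        self.data = list(range(n))

obj = BigArray(1000)
ref = weakref.ref(obj)  # observe the object without keeping it alive
del obj                 # drop the only strong reference
gc.collect()            # not strictly needed in CPython, but makes the point explicit
assert ref() is None    # the big object has been reclaimed
```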
583220835 | MDU6SXNzdWU1ODMyMjA4MzU= | 3866 | Allow `isel` to ignore missing dimensions? | johnomotani 3958036 | closed | 0 | 0 | 2020-03-17T18:41:13Z | 2020-04-03T19:47:08Z | 2020-04-03T19:47:08Z | CONTRIBUTOR | Sometimes it would be nice for `isel` to ignore indexers for dimensions the object does not have:
```python
ds.isel(t=0)  # currently raises an exception if ds has no "t" dimension
ds.isel(t=0, ignore_missing=True)  # would be nice if this was allowed, just returning ds
```
For example, this helps when writing a function that can be called on variables with different combinations of dimensions. I think it should be fairly easy to implement, just add the argument to the condition here https://github.com/pydata/xarray/blob/65a5bff79479c4b56d6f733236fe544b7f4120a8/xarray/core/variable.py#L1059-L1062 the only downside would be increased complexity of adding another argument to the API for an issue where a workaround is not hard (at least in the case I have at the moment), just a bit clumsy. |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3866/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
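The requested behaviour is essentially a filter over the indexers before they reach the variables (xarray's `isel` later grew a similar `missing_dims` argument). A pure-Python sketch of that filtering step; `filter_indexers` is a hypothetical helper, not xarray's implementation:

```python
def filter_indexers(indexers, dims, missing_dims="raise"):
    """Drop indexers that refer to dimensions absent from `dims`.

    Sketch of the requested behaviour: with "ignore", indexers for
    missing dimensions are silently discarded instead of raising.
    """
    missing = [d for d in indexers if d not in dims]
    if missing and missing_dims == "raise":
        raise ValueError(f"Dimensions {missing} do not exist in {tuple(dims)}")
    return {d: v for d, v in indexers.items() if d in dims}

# A variable without a "t" dimension: isel(t=0, x=2) becomes just x=2.
assert filter_indexers({"t": 0, "x": 2}, ("x", "y"), missing_dims="ignore") == {"x": 2}
```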
583080947 | MDU6SXNzdWU1ODMwODA5NDc= | 3865 | `merge` drops attributes | johnomotani 3958036 | closed | 0 | 1 | 2020-03-17T15:06:18Z | 2020-03-24T20:40:18Z | 2020-03-24T20:40:18Z | CONTRIBUTOR |
MCVE Code Sample
```python
import xarray as xr

ds1 = xr.Dataset()
ds1.attrs['a'] = 42
ds2 = xr.Dataset()
ds2.attrs['a'] = 42
merged = xr.merge([ds1, ds2])
print(merged)
```
Expected Output: a merged Dataset that keeps the common attrs (`{'a': 42}`).
Problem Description: The current behaviour means I have to manually check and copy the attrs after every merge. I'm happy to attempt a PR to fix this.
Proposal (following the pattern of existing merge options): add a keyword argument that controls how attrs are combined. This proposal should also allow conflicting attrs to be handled explicitly. Versions: Output of `xr.show_versions()` INSTALLED VERSIONS ------------------ commit: None python: 3.6.9 (default, Nov 7 2019, 10:44:02) [GCC 8.3.0] python-bits: 64 OS: Linux OS-release: 5.3.0-40-generic machine: x86_64 processor: x86_64 byteorder: little LC_ALL: None LANG: en_GB.UTF-8 LOCALE: en_GB.UTF-8 libhdf5: 1.10.2 libnetcdf: 4.6.3 xarray: 0.15.0 pandas: 1.0.2 numpy: 1.18.1 scipy: 1.3.0 netCDF4: 1.5.1.2 pydap: None h5netcdf: None h5py: 2.9.0 Nio: None zarr: None cftime: 1.0.3.4 nc_time_axis: None PseudoNetCDF: None rasterio: None cfgrib: None iris: None bottleneck: None dask: 2.12.0 distributed: None matplotlib: 3.1.1 cartopy: None seaborn: None numbagg: None setuptools: 45.2.0 pip: 9.0.1 conda: None pytest: 4.4.1 IPython: 7.8.0 sphinx: 1.8.3 |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3865/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
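The proposal can be sketched in pure Python. `combine_attrs` here is a hypothetical helper illustrating two policies: "drop" reproduces the reported behaviour (all attrs lost), while "no_conflicts" keeps attributes and raises on disagreements (xarray's `merge` later gained a `combine_attrs` option along these lines):

```python
def combine_attrs(attrs_list, mode="drop"):
    """Combine a list of attrs dicts according to `mode`.

    Hypothetical sketch: "drop" discards everything (the behaviour the
    report complains about); "no_conflicts" keeps attributes, raising
    if the same key appears with different values.
    """
    if mode == "drop":
        return {}
    if mode == "no_conflicts":
        result = {}
        for attrs in attrs_list:
            for k, v in attrs.items():
                if k in result and result[k] != v:
                    raise ValueError(f"conflicting values for attribute {k!r}")
                result[k] = v
        return result
    raise ValueError(f"unknown mode {mode!r}")

# The MCVE's case: identical attrs on both inputs should survive the merge.
assert combine_attrs([{"a": 42}, {"a": 42}], mode="no_conflicts") == {"a": 42}
```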
532165408 | MDU6SXNzdWU1MzIxNjU0MDg= | 3590 | cmap.set_under() does not work as expected | johnomotani 3958036 | closed | 0 | 5 | 2019-12-03T18:04:07Z | 2020-02-24T20:20:07Z | 2020-02-24T20:20:07Z | CONTRIBUTOR | When using matplotlib, the `set_under()` colour is applied to values below `vmin`:
```python
import matplotlib
import numpy
from matplotlib import pyplot

dat = numpy.linspace(0, 1)[numpy.newaxis, :]*numpy.linspace(0, 1)[:, numpy.newaxis]
cmap = matplotlib.cm.viridis
cmap.set_under('w')
pyplot.contourf(dat, vmin=.3, cmap=cmap)
pyplot.colorbar()
pyplot.show()
```
produces a plot whose below-`vmin` region is white, and commenting or uncommenting the `set_under` line toggles that behaviour as expected (plot images not preserved in this export). However, using xarray's wrapper:
```python
from xarray import DataArray

da = DataArray(numpy.linspace(0, 1)[numpy.newaxis, :]*numpy.linspace(0, 1)[:, numpy.newaxis])
cmap = matplotlib.cm.viridis
cmap.set_under('w')
da.plot.contourf(vmin=.3, cmap=cmap)
pyplot.show()
```
the `set_under()` colour does not appear. Output of <tt>xr.show_versions()</tt>:
|
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3590/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue | ||||||
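Part of the story here is matplotlib's `extend` setting: `set_under`/`set_over` colours are only shown when the norm/colorbar are told the data extends beyond `vmin`/`vmax`. A pure-Python sketch of that inference; `infer_extend` is a hypothetical helper mirroring the kind of check a plotting wrapper performs, not xarray's actual function:

```python
def infer_extend(data_min, data_max, vmin=None, vmax=None):
    """Decide the matplotlib-style extend value for a plot.

    Sketch: if vmin/vmax clip the data, the colormap and colorbar need
    extend="min"/"max"/"both" so out-of-range values get the
    set_under/set_over colours instead of being saturated silently.
    """
    extend_min = vmin is not None and data_min < vmin
    extend_max = vmax is not None and data_max > vmax
    if extend_min and extend_max:
        return "both"
    if extend_min:
        return "min"
    if extend_max:
        return "max"
    return "neither"

# The MCVE's situation: data spans [0, 1] but vmin=0.3 clips it from below.
assert infer_extend(0.0, 1.0, vmin=0.3) == "min"
```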
507468507 | MDU6SXNzdWU1MDc0Njg1MDc= | 3401 | _get_scheduler() exception if dask.multiprocessing missing | johnomotani 3958036 | closed | 0 | 0 | 2019-10-15T20:35:14Z | 2019-10-21T00:17:48Z | 2019-10-21T00:17:48Z | CONTRIBUTOR | These lines were recently changed in #3358 https://github.com/pydata/xarray/blob/3f9069ba376afa35c0ca83b09a6126dd24cb8127/xarray/backends/locks.py#L87-L92 If the 'cloudpickle' package is not installed, then `dask.multiprocessing` does not exist and `_get_scheduler()` raises an AttributeError. Suggest either reverting the changes that removed the previous guard, or otherwise handling the missing attribute. To reproduce: 1. check 'cloudpickle' is not installed, but 'dask' is 2. execute the following commands
```python
import xarray
xarray.backends.api._get_scheduler()
```
```
AttributeError                            Traceback (most recent call last)
<ipython-input-2-20da238796b7> in <module>
----> 1 xarray.backends.api._get_scheduler()

~/.local/lib/python3.6/site-packages/xarray/backends/locks.py in _get_scheduler(get, collection)
     87             pass
     88
---> 89     if actual_get is dask.multiprocessing.get:
     90         return "multiprocessing"
     91     else:

AttributeError: module 'dask' has no attribute 'multiprocessing'
``` |
{ "url": "https://api.github.com/repos/pydata/xarray/issues/3401/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | xarray 13221727 | issue |