
issue_comments


14 rows where user = 9155111 sorted by updated_at descending




issue 7

  • Add optional dependencies 3
  • Dask error when importing pip installed xarray. 3
  • Backend / plugin system `remove_duplicates` raises AttributeError on discovering duplicates 3
  • Performance: numpy indexes small amounts of data 1000x faster than xarray 2
  • Regression: 3rd party backends are not discovered with `xarray==0.20.0` 1
  • fix the detection of backend entrypoints 1
  • Warn and raise ImportError, if a DuckArrayModule is corrupted 1

user 1

  • ashwinvis · 14

author_association 1

  • CONTRIBUTOR 14
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions performed_via_github_app issue
990886926 https://github.com/pydata/xarray/pull/6039#issuecomment-990886926 https://api.github.com/repos/pydata/xarray/issues/6039 IC_kwDOAMm_X847D7wO ashwinvis 9155111 2021-12-10T11:22:02Z 2021-12-10T11:22:02Z CONTRIBUTOR

For some context: without this fix, there would be strange errors, not only for developers of xarray but also for developers of third-party libraries which have an `import xarray` somewhere in the code.
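The idea behind the fix can be sketched with the standard library alone: probe the optional dependency and treat *any* exception as "module unavailable", rather than letting a corrupted install crash whoever imports xarray. This is a minimal sketch, not xarray's actual `DuckArrayModule` code; the helper name `maybe_import` is made up for illustration.

```python
import importlib

def maybe_import(name):
    # Hypothetical helper: probe an optional dependency defensively.
    # A corrupted install can raise errors other than ImportError at
    # import time, so catch Exception and keep the error for a warning.
    try:
        return importlib.import_module(name), None
    except Exception as err:
        return None, err

module, err = maybe_import("surely_not_an_installed_package")
```

The caller can then warn about `err` and raise a proper ImportError only when the module is actually used.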

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Warn and raise ImportError, if a DuckArrayModule is corrupted 1070532676
985473412 https://github.com/pydata/xarray/issues/5841#issuecomment-985473412 https://api.github.com/repos/pydata/xarray/issues/5841 IC_kwDOAMm_X846vSGE ashwinvis 9155111 2021-12-03T12:17:51Z 2021-12-03T12:33:59Z CONTRIBUTOR

See also #5236, which could be related and which uses conda.

I tried to scour through pip and pytest issues, but I can't find a discussion on this. `pip`, `pytest` and `__pycache__` are too generic as keywords. I even found a pytest plugin to remove bytecode, but it does not work ~~anymore~~ for me.

I proposed a fix in #6039, which I believe is good to have, as you can't expect every user to execute with `python -B` or set `PYTHONDONTWRITEBYTECODE` while running tests.
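For reference, bytecode writing can also be disabled from inside the interpreter; a minimal sketch of the process-level equivalent of `python -B` / `PYTHONDONTWRITEBYTECODE`:

```python
import sys

# Equivalent of running `python -B` or exporting PYTHONDONTWRITEBYTECODE=1:
# stop this interpreter from writing .pyc files into __pycache__ directories.
sys.dont_write_bytecode = True
```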

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dask error when importing pip installed xarray. 1019478260
985386101 https://github.com/pydata/xarray/issues/5841#issuecomment-985386101 https://api.github.com/repos/pydata/xarray/issues/5841 IC_kwDOAMm_X846u8x1 ashwinvis 9155111 2021-12-03T10:02:51Z 2021-12-03T10:02:51Z CONTRIBUTOR

This, in my opinion, is a cascade of multiple issues:

- pytest creates special bytecode
- pip does not clean the whole `__pycache__` directory on uninstall
- `xarray.pycompat.is_duck_dask_array` uses `importlib.import_module('dask')` to determine if it is installed
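The interaction can be demonstrated with the standard library alone: a package directory that survives an uninstall with nothing but a stale `__pycache__` inside is still importable, because Python treats any bare directory on `sys.path` as an implicit namespace package. A sketch, using a made-up `ghost_pkg` directory:

```python
import importlib.util
import sys
import tempfile
from pathlib import Path

# Simulate an incomplete uninstall: a package directory left behind
# containing only a stale __pycache__, no __init__.py and no modules.
site = Path(tempfile.mkdtemp())
(site / "ghost_pkg" / "__pycache__").mkdir(parents=True)
sys.path.insert(0, str(site))

# Python still "finds" the package, as an implicit namespace package,
# so an importlib-based probe happily reports it as installed.
spec = importlib.util.find_spec("ghost_pkg")
```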

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dask error when importing pip installed xarray. 1019478260
985379668 https://github.com/pydata/xarray/issues/5841#issuecomment-985379668 https://api.github.com/repos/pydata/xarray/issues/5841 IC_kwDOAMm_X846u7NU ashwinvis 9155111 2021-12-03T09:53:46Z 2021-12-03T09:53:46Z CONTRIBUTOR

@twhughes @keewis In my case, I encounter this when I:

- Have xarray + dask installed in my environment
- Run any pytest which imports xarray (which in turn imports dask)
- Uninstall dask, but pytest artefacts remain:

```sh
❯ ls venv/lib/python3.9/site-packages/dask/__pycache__
utils_test.cpython-39-pytest-6.2.5.pyc
```

Here are the steps which would hopefully reproduce the behaviour.

Create this small unit test named `test_import_xarray.py`:

```py
def test_xr():
    import xarray
```

```sh
python -m venv venv
source venv/bin/activate
pip install xarray pytest dask
pytest test_import_xarray.py

pip uninstall dask
ls venv/lib/python3.9/site-packages/dask/__pycache__  # See output above
python -c 'import dask'    # No ImportError!
python -c 'import xarray'
```

Exact versions used: Python 3.9.9 along with:

```sh
# requirements.txt
attrs==21.2.0
cloudpickle==2.0.0
dask==2021.11.2
fsspec==2021.11.1
iniconfig==1.1.1
locket==0.2.1
numpy==1.21.4
packaging==21.3
pandas==1.3.4
partd==1.2.0
pluggy==1.0.0
py==1.11.0
pyparsing==3.0.6
pytest==6.2.5
python-dateutil==2.8.2
pytz==2021.3
PyYAML==6.0
six==1.16.0
toml==0.10.2
toolz==0.11.2
xarray==0.20.1
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Dask error when importing pip installed xarray. 1019478260
963006086 https://github.com/pydata/xarray/issues/5944#issuecomment-963006086 https://api.github.com/repos/pydata/xarray/issues/5944 IC_kwDOAMm_X845Zk6G ashwinvis 9155111 2021-11-08T10:19:37Z 2021-11-08T10:19:37Z CONTRIBUTOR

To use the select method, the following should be changed:

  • Swap the imports so that newer importlib_metadata is imported first: https://github.com/pydata/xarray/blob/e0deb9cf0a5cd5c9e3db033fd13f075added9c1e/xarray/backends/plugins.py#L8-L12
  • Require importlib_metadata for Python < 3.10 https://github.com/pydata/xarray/blob/e0deb9cf0a5cd5c9e3db033fd13f075added9c1e/setup.cfg#L81
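As a sketch of what the two bullet points amount to (a hypothetical compat shim, not the actual xarray code):

```python
import sys

# Prefer the stdlib module on Python >= 3.10, where entry_points()
# supports .select(); otherwise fall back to the importlib_metadata
# backport, which would be declared as a dependency for Python < 3.10.
if sys.version_info >= (3, 10):
    from importlib.metadata import entry_points
else:
    try:
        from importlib_metadata import entry_points  # backport
    except ImportError:
        from importlib.metadata import entry_points  # may lack .select()
```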
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Backend / plugin system `remove_duplicates` raises AttributeError on discovering duplicates 1046454702
963003300 https://github.com/pydata/xarray/issues/5944#issuecomment-963003300 https://api.github.com/repos/pydata/xarray/issues/5944 IC_kwDOAMm_X845ZkOk ashwinvis 9155111 2021-11-08T10:16:35Z 2021-11-08T10:16:35Z CONTRIBUTOR

@kmuehlbauer

  • Add a `breakpoint()` below line 102 and print out `entrypoints`. Step into `build_engines` and follow it using the debugger.
  • Can you also try this suggestion: https://github.com/pydata/xarray/issues/5944#issuecomment-962414054
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Backend / plugin system `remove_duplicates` raises AttributeError on discovering duplicates 1046454702
962414054 https://github.com/pydata/xarray/issues/5944#issuecomment-962414054 https://api.github.com/repos/pydata/xarray/issues/5944 IC_kwDOAMm_X845XUXm ashwinvis 9155111 2021-11-06T07:50:52Z 2021-11-06T07:50:52Z CONTRIBUTOR

On a side note, the `.get` syntax is deprecated in the importlib_metadata package and most likely in Python 3.10's importlib.metadata stdlib.

```py
In [16]: from importlib_metadata import entry_points

In [17]: entry_points().get('xarray.backends', ())
<ipython-input-17-5f3ea0df5c10>:1: DeprecationWarning: SelectableGroups dict interface is deprecated. Use select.
  entry_points().get('xarray.backends', ())
Out[17]:
[EntryPoint(name='rasterio', value='rioxarray.xarray_plugin:RasterioBackend', group='xarray.backends'),
 EntryPoint(name='pymech', value='pymech.dataset:PymechXarrayBackend', group='xarray.backends')]

In [18]: entry_points().select(group='xarray.backends')
Out[18]:
[EntryPoint(name='rasterio', value='rioxarray.xarray_plugin:RasterioBackend', group='xarray.backends'),
 EntryPoint(name='pymech', value='pymech.dataset:PymechXarrayBackend', group='xarray.backends')]
```
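A small helper that tolerates both APIs could look like the following; this is a sketch only (the function name `backend_entry_points` is made up), shown because the code has to run on interpreters with either interface:

```python
from importlib.metadata import entry_points

def backend_entry_points():
    # Hypothetical shim: use .select() where available (importlib.metadata
    # on Python >= 3.10, or a recent importlib_metadata backport), otherwise
    # fall back to the deprecated dict-style .get() interface.
    eps = entry_points()
    if hasattr(eps, "select"):
        return list(eps.select(group="xarray.backends"))
    return list(eps.get("xarray.backends", []))
```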

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Backend / plugin system `remove_duplicates` raises AttributeError on discovering duplicates 1046454702
959131968 https://github.com/pydata/xarray/pull/5931#issuecomment-959131968 https://api.github.com/repos/pydata/xarray/issues/5931 IC_kwDOAMm_X845KzFA ashwinvis 9155111 2021-11-03T13:52:26Z 2021-11-03T13:54:22Z CONTRIBUTOR

Or add a small dummy package to be installed using pip while testing. Something like the following under the ci/requirements/*.yml files should do it:

```yml
- pip:
  - ./xarray_test_package
```

with the following contents at ci/requirements, at the bare minimum:

```ini
❯ cat xarray_test_package/pyproject.toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

❯ cat xarray_test_package/setup.cfg
[metadata]
name = xarray_test_package
version = 0.0.0

[options]
packages = find:

[options.entry_points]
xarray.backends =
    xarray_test_package_backend = xarray_test_package.plugin:XarrayTestPackageBackend
```

This can later be extended with a minimal xarray_test_package.plugin module if needed.

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  fix the detection of backend entrypoints 1043378880
958787822 https://github.com/pydata/xarray/issues/5930#issuecomment-958787822 https://api.github.com/repos/pydata/xarray/issues/5930 IC_kwDOAMm_X845JfDu ashwinvis 9155111 2021-11-03T09:43:29Z 2021-11-03T09:43:29Z CONTRIBUTOR

In the same example, first-party backends such as h5netcdf work with both versions.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Regression: 3rd party backends are not discovered with `xarray==0.20.0` 1043276928
703172986 https://github.com/pydata/xarray/pull/4480#issuecomment-703172986 https://api.github.com/repos/pydata/xarray/issues/4480 MDEyOklzc3VlQ29tbWVudDcwMzE3Mjk4Ng== ashwinvis 9155111 2020-10-03T22:36:39Z 2020-10-03T22:36:39Z CONTRIBUTOR

The only hiccup I noticed was that scitools-iris requires cartopy (which installs only if numpy is pre-installed), which in turn requires cf-units (which again installs only if the UDUNITS-2 library is present). I had gdal installed on my machine, so I did not notice this. In short, it is best to avoid / mock the scitools packages if it can be avoided. I added it as a doc dependency because, the way it is configured now, it is mandatory to have the scitools packages. There is certainly room for future improvement.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add optional dependencies 713606589
703138718 https://github.com/pydata/xarray/pull/4480#issuecomment-703138718 https://api.github.com/repos/pydata/xarray/issues/4480 MDEyOklzc3VlQ29tbWVudDcwMzEzODcxOA== ashwinvis 9155111 2020-10-03T17:40:27Z 2020-10-03T17:40:27Z CONTRIBUTOR

Iris on PyPI is a dummy package. The package relevant for us is scitools-iris which can be difficult to install. That and some other packages should be installed to generate the docs. Ideally these should be mocked, maybe for another issue.

I tracked down all the dependencies required for docs. Could be useful for development.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add optional dependencies 713606589
702808164 https://github.com/pydata/xarray/pull/4480#issuecomment-702808164 https://api.github.com/repos/pydata/xarray/issues/4480 MDEyOklzc3VlQ29tbWVudDcwMjgwODE2NA== ashwinvis 9155111 2020-10-02T15:44:00Z 2020-10-02T15:44:00Z CONTRIBUTOR

@mathause Something similar to https://docs.dask.org/en/latest/install.html#pip ?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Add optional dependencies 713606589
539352070 https://github.com/pydata/xarray/issues/2799#issuecomment-539352070 https://api.github.com/repos/pydata/xarray/issues/2799 MDEyOklzc3VlQ29tbWVudDUzOTM1MjA3MA== ashwinvis 9155111 2019-10-08T06:08:27Z 2019-10-08T06:08:48Z CONTRIBUTOR

I suspect system jitter in the profiling, as the time for `Dataset.isel` went up. It would be useful to run `sudo python -m pyperf system tune` before running profilers/benchmarks.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Performance: numpy indexes small amounts of data 1000x faster than xarray 416962458
538366978 https://github.com/pydata/xarray/issues/2799#issuecomment-538366978 https://api.github.com/repos/pydata/xarray/issues/2799 MDEyOklzc3VlQ29tbWVudDUzODM2Njk3OA== ashwinvis 9155111 2019-10-04T11:57:10Z 2019-10-04T11:57:10Z CONTRIBUTOR

> At first sight it looks somewhat like a hybrid between Cython (for the ahead-of-time transpiling to C++) and numba (for having python-compatible syntax).

Not really. Pythran always releases the GIL and does a bunch of optimizations between transpilation and compilation.

A good approach would be to try out different compilers and see what performance is obtained, without losing readability (https://github.com/pydata/xarray/issues/2799#issuecomment-469444519). See scikit-image/scikit-image#4199, where the package transonic was being experimentally tested to replace Cython-only code with Python code + type hints. As a bonus, you get to switch between Cython, Pythran and Numba.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Performance: numpy indexes small amounts of data 1000 faster than xarray 416962458


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
Powered by Datasette · Queries took 17.244ms · About: xarray-datasette