issues


3 rows where user = 4179064 sorted by updated_at descending


Issue #2064 (id 314764258, node_id MDU6SXNzdWUzMTQ3NjQyNTg=): concat_dim getting added to *all* variables of multifile datasets
Opened by xylar (4179064) · state: open · comments: 31 · created: 2018-04-16T18:16:25Z · updated: 2022-07-16T14:11:42Z · author_association: NONE

Code Sample

Using the following example data set: example_jan.nc

```python
#!/usr/bin/env python3

import xarray

ds = xarray.open_mfdataset('example_jan.nc', concat_dim='Time')
print(ds)
```

The result from xarray 0.10.2 (and all previous xarray versions we've worked with):

```
Dimensions:  (Time: 1, nOceanRegions: 7, nOceanRegionsTmp: 7, nVertLevels: 100)
Dimensions without coordinates: Time, nOceanRegions, nOceanRegionsTmp, nVertLevels
Data variables:
    time_avg_avgValueWithinOceanLayerRegion_avgLayerTemperature  (Time, nOceanRegionsTmp, nVertLevels) float64 dask.array<shape=(1, 7, 100), chunksize=(1, 7, 100)>
    time_avg_avgValueWithinOceanRegion_avgSurfaceTemperature     (Time, nOceanRegions) float64 dask.array<shape=(1, 7), chunksize=(1, 7)>
    time_avg_daysSinceStartOfSim                                 (Time) timedelta64[ns] dask.array<shape=(1,), chunksize=(1,)>
    xtime_end                                                    (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    xtime_start                                                  (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    refBottomDepth                                               (nVertLevels) float64 dask.array<shape=(100,), chunksize=(100,)>
Attributes:
    history:  Tue Dec 6 04:49:14 2016: ncatted -O -a ,global,d,, acme_alaph7...
    NCO:      "4.6.2"
```

The result with xarray 0.10.3:

```
<xarray.Dataset>
Dimensions:  (Time: 1, nOceanRegions: 7, nOceanRegionsTmp: 7, nVertLevels: 100)
Dimensions without coordinates: Time, nOceanRegions, nOceanRegionsTmp, nVertLevels
Data variables:
    time_avg_avgValueWithinOceanLayerRegion_avgLayerTemperature  (Time, nOceanRegionsTmp, nVertLevels) float64 dask.array<shape=(1, 7, 100), chunksize=(1, 7, 100)>
    time_avg_avgValueWithinOceanRegion_avgSurfaceTemperature     (Time, nOceanRegions) float64 dask.array<shape=(1, 7), chunksize=(1, 7)>
    time_avg_daysSinceStartOfSim                                 (Time) timedelta64[ns] dask.array<shape=(1,), chunksize=(1,)>
    xtime_end                                                    (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    xtime_start                                                  (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    refBottomDepth                                               (Time, nVertLevels) float64 dask.array<shape=(1, 100), chunksize=(1, 100)>
Attributes:
    history:  Tue Dec 6 04:49:14 2016: ncatted -O -a ,global,d,, acme_alaph7...
    NCO:      "4.6.2"
```

Problem description

The expected behavior for us was that refBottomDepth should not have Time as a dimension. It does not vary with time and does not have a Time dimension in the input data set.

It seems like https://github.com/pydata/xarray/issues/1988 and https://github.com/pydata/xarray/pull/2048 were intended to address cases where concat_dim was not yet present in the input files. But in cases where concat_dim is already in the input files, it seems like only the fields that include this dimension should be concatenated, and the other fields should remain free of concat_dim. Part of the problem for us is that the number of dimensions of some of our variables changes depending on which xarray version is in use.
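For readers hitting the same behavior: `xarray.concat` (and, in recent versions, `open_mfdataset`) accepts `data_vars='minimal'`, which concatenates only variables that already contain the concat dimension. A minimal in-memory sketch of that option (the variable names `temp` and `ref_depth` are invented for illustration, not taken from the report):

```python
import xarray as xr

# Two synthetic "files" standing in for a multifile dataset:
# temp varies along Time; ref_depth does not.
ds1 = xr.Dataset({'temp': ('Time', [1.0]), 'ref_depth': ('depth', [10.0, 20.0])})
ds2 = xr.Dataset({'temp': ('Time', [2.0]), 'ref_depth': ('depth', [10.0, 20.0])})

# data_vars='minimal' concatenates only variables that already carry the
# concat dimension, so ref_depth stays free of Time.
combined = xr.concat([ds1, ds2], dim='Time', data_vars='minimal')
print(combined['ref_depth'].dims)  # ('depth',)
print(combined['temp'].dims)       # ('Time',)
```

With the default `data_vars='all'`, every variable gains the Time dimension, which matches the 0.10.3 behavior described above.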

Expected Output

The output from xarray 0.10.2, shown above.

Output of xr.show_versions()

```
/home/xylar/miniconda2/envs/mpas_analysis_py3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Linux
OS-release: 4.13.0-38-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

xarray: 0.10.3
pandas: 0.22.0
numpy: 1.14.2
scipy: 1.0.1
netCDF4: 1.3.1
h5netcdf: 0.5.1
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.2
distributed: 1.21.6
matplotlib: 2.2.2
cartopy: 0.16.0
seaborn: None
setuptools: 39.0.1
pip: 9.0.3
conda: None
pytest: 3.5.0
IPython: None
sphinx: 1.7.2
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2064/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: issue
Issue #3552 (id 525903846, node_id MDU6SXNzdWU1MjU5MDM4NDY=): Dataset.drop() no longer works with a set as an argument
Opened by xylar (4179064) · state: closed · comments: 15 · created: 2019-11-20T16:08:18Z · updated: 2020-01-14T16:13:24Z · closed: 2020-01-14T16:13:24Z · author_association: NONE

MCVE Code Sample

```python
#!/usr/bin/env python

import xarray
import numpy

ds = xarray.Dataset()
ds['x'] = ('x', numpy.linspace(0., 1., 100))
ds['y'] = ('y', numpy.linspace(0., 1., 50))
print(ds)
ds = ds.drop(set('x'))
print(ds)
```

Expected Output

```
<xarray.Dataset>
Dimensions:  (x: 100, y: 50)
Coordinates:
  * x        (x) float64 0.0 0.0101 0.0202 0.0303 ... 0.9697 0.9798 0.9899 1.0
  * y        (y) float64 0.0 0.02041 0.04082 0.06122 ... 0.9592 0.9796 1.0
Data variables:
    *empty*

<xarray.Dataset>
Dimensions:  (y: 50)
Coordinates:
  * y        (y) float64 0.0 0.02041 0.04082 0.06122 ... 0.9592 0.9796 1.0
Data variables:
    *empty*
```

Problem Description

In versions before xarray 0.14.1, the code above, which calls Dataset.drop() with a set as its argument, worked as expected. With the new release, it raises the error shown below. This breaks backwards compatibility for our software (MPAS-Analysis).

```
<xarray.Dataset>
Dimensions:  (x: 100, y: 50)
Coordinates:
  * x        (x) float64 0.0 0.0101 0.0202 0.0303 ... 0.9697 0.9798 0.9899 1.0
  * y        (y) float64 0.0 0.02041 0.04082 0.06122 ... 0.9592 0.9796 1.0
Data variables:
    *empty*
Traceback (most recent call last):
  File "./drop_issue.py", line 10, in <module>
    ds = ds.drop(set('x'))
  File "/home/xylar/miniconda3/envs/test/lib/python3.7/site-packages/xarray/core/dataset.py", line 3643, in drop
    return self.drop_sel(labels, errors=errors)
  File "/home/xylar/miniconda3/envs/test/lib/python3.7/site-packages/xarray/core/dataset.py", line 3689, in drop_sel
    labels = either_dict_or_kwargs(labels, labels_kwargs, "drop")
  File "/home/xylar/miniconda3/envs/test/lib/python3.7/site-packages/xarray/core/utils.py", line 257, in either_dict_or_kwargs
    "the first argument to .%s must be a dictionary" % func_name
ValueError: the first argument to .drop must be a dictionary
```
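On recent xarray versions (0.14+), the deprecation of the overloaded drop() splits it into explicit methods; Dataset.drop_vars takes a name or any iterable of names, so a set works there without the dictionary ambiguity. A minimal sketch of that workaround:

```python
import numpy
import xarray

ds = xarray.Dataset()
ds['x'] = ('x', numpy.linspace(0., 1., 100))
ds['y'] = ('y', numpy.linspace(0., 1., 50))

# drop_vars accepts an iterable of variable names, including a set.
ds2 = ds.drop_vars({'x'})
print(sorted(ds2.variables))  # ['y']
```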

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 21:52:21) [GCC 7.3.0]
python-bits: 64
OS: Linux
OS-release: 4.15.0-1063-oem
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.1

xarray: 0.14.1
pandas: 0.25.3
numpy: 1.17.3
scipy: 1.3.2
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.0.4.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.1
dask: 2.8.0
distributed: 2.8.0
matplotlib: 3.1.2
cartopy: 0.17.0
seaborn: None
numbagg: None
setuptools: 41.6.0.post20191101
pip: 19.3.1
conda: None
pytest: 5.3.0
IPython: None
sphinx: None
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3552/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
Issue #2680 (id 399457300, node_id MDU6SXNzdWUzOTk0NTczMDA=): Error importing xarray 0.11.2 in python 2.7
Opened by xylar (4179064) · state: closed · comments: 1 · created: 2019-01-15T17:37:56Z · updated: 2019-01-15T19:38:05Z · closed: 2019-01-15T19:38:05Z · author_association: NONE

I'm creating a fresh conda environment that's pretty minimal:

```bash
conda create -y -n test_env python=2.7 xarray numpy dask
conda activate test_env
python -c "import xarray"
```

I see the following error message:

```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/xarray/__init__.py", line 10, in <module>
    from .core.alignment import align, broadcast, broadcast_arrays
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/xarray/core/alignment.py", line 10, in <module>
    from . import utils
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/xarray/core/utils.py", line 16, in <module>
    from .pycompat import (
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/xarray/core/pycompat.py", line 73, in <module>
    import dask.array
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/dask/array/__init__.py", line 9, in <module>
    from .routines import (take, choose, argwhere, where, coarsen, insert,
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/site-packages/dask/array/routines.py", line 256, in <module>
    @wraps(np.matmul)
  File "/home/xylar/miniconda3/envs/test_env4/lib/python2.7/functools.py", line 33, in update_wrapper
    setattr(wrapper, attr, getattr(wrapped, attr))
AttributeError: 'numpy.ufunc' object has no attribute '__module__'
```

If I omit dask, I don't see this error. I also hadn't had any trouble before this week, so it's possible that this is caused indirectly by an update to another library.
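The bottom of the traceback suggests the trigger: numpy 1.16 turned np.matmul into a ufunc, and Python 2's functools.wraps fails when the wrapped object lacks a `__module__` attribute. As a possible workaround (an assumption, not a confirmed fix), pinning numpy below 1.16 when creating the environment should avoid the new ufunc:

```shell
# Hypothetical workaround: pin numpy < 1.16 so np.matmul is not yet a ufunc
conda create -y -n test_env python=2.7 xarray "numpy<1.16" dask
conda activate test_env
python -c "import xarray"
```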

Expected Output

Nothing at all.

Details of conda environment

Details from conda create

```
conda create -n test_env4 python=2.7 xarray numpy dask
Solving environment: done

## Package Plan ##

  environment location: /home/xylar/miniconda3/envs/test_env4

  added / updated specs:
    - dask
    - numpy
    - python=2.7
    - xarray

The following NEW packages will be INSTALLED:

    backports_abc: 0.5-py_1 conda-forge
    blas: 1.1-openblas conda-forge
    bokeh: 1.0.4-py27_1000 conda-forge
    ca-certificates: 2018.11.29-ha4d7672_0 conda-forge
    certifi: 2018.11.29-py27_1000 conda-forge
    click: 7.0-py_0 conda-forge
    cloudpickle: 0.6.1-py_0 conda-forge
    cytoolz: 0.9.0.1-py27h14c3975_1001 conda-forge
    dask: 1.0.0-py_0 conda-forge
    dask-core: 1.0.0-py_0 conda-forge
    distributed: 1.25.2-py27_1000 conda-forge
    freetype: 2.9.1-h3cfcefd_1004 conda-forge
    futures: 3.2.0-py27_1000 conda-forge
    heapdict: 1.0.0-py27_1000 conda-forge
    jinja2: 2.10-py_1 conda-forge
    jpeg: 9c-h14c3975_1001 conda-forge
    libffi: 3.2.1-hf484d3e_1005 conda-forge
    libgcc-ng: 7.3.0-hdf63c60_0 conda-forge
    libgfortran-ng: 7.2.0-hdf63c60_3 conda-forge
    libpng: 1.6.36-h84994c4_1000 conda-forge
    libstdcxx-ng: 7.3.0-hdf63c60_0 conda-forge
    libtiff: 4.0.10-h648cc4a_1001 conda-forge
    locket: 0.2.0-py_2 conda-forge
    markupsafe: 1.1.0-py27h14c3975_1000 conda-forge
    msgpack-python: 0.6.0-py27h6bb024c_1000 conda-forge
    ncurses: 6.1-hf484d3e_1002 conda-forge
    numpy: 1.16.0-py27_blas_openblash1522bff_1000 conda-forge [blas_openblas]
    olefile: 0.46-py_0 conda-forge
    openblas: 0.3.3-h9ac9557_1001 conda-forge
    openssl: 1.0.2p-h14c3975_1002 conda-forge
    packaging: 18.0-py_0 conda-forge
    pandas: 0.24.0rc1-py27hf484d3e_0 conda-forge
    partd: 0.3.9-py_0 conda-forge
    pillow: 5.4.1-py27h00a061d_1000 conda-forge
    pip: 18.1-py27_1000 conda-forge
    psutil: 5.4.8-py27h14c3975_1000 conda-forge
    pyparsing: 2.3.1-py_0 conda-forge
    python: 2.7.15-h938d71a_1006 conda-forge
    python-dateutil: 2.7.5-py_0 conda-forge
    pytz: 2018.9-py_0 conda-forge
    pyyaml: 3.13-py27h14c3975_1001 conda-forge
    readline: 7.0-hf8c457e_1001 conda-forge
    setuptools: 40.6.3-py27_0 conda-forge
    singledispatch: 3.4.0.3-py27_1000 conda-forge
    six: 1.12.0-py27_1000 conda-forge
    sortedcontainers: 2.1.0-py_0 conda-forge
    sqlite: 3.26.0-h67949de_1000 conda-forge
    tblib: 1.3.2-py_1 conda-forge
    tk: 8.6.9-h84994c4_1000 conda-forge
    toolz: 0.9.0-py_1 conda-forge
    tornado: 5.1.1-py27h14c3975_1000 conda-forge
    wheel: 0.32.3-py27_0 conda-forge
    xarray: 0.11.2-py27_1000 conda-forge
    xz: 5.2.4-h14c3975_1001 conda-forge
    yaml: 0.1.7-h14c3975_1001 conda-forge
    zict: 0.1.3-py_0 conda-forge
    zlib: 1.2.11-h14c3975_1004 conda-forge

Proceed ([y]/n)? y

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate test_env4
#
# To deactivate an active environment, use
#
#     $ conda deactivate
```

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2680/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
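The row-count line above ("3 rows where user = 4179064 sorted by updated_at descending") corresponds to a simple query against this schema. A self-contained sketch against a trimmed, in-memory copy of the table (only three of the columns are kept, with the ids and timestamps from the rows above):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Trimmed version of the [issues] schema shown above
conn.execute('CREATE TABLE issues (id INTEGER PRIMARY KEY, user INTEGER, updated_at TEXT)')
rows = [(314764258, 4179064, '2022-07-16T14:11:42Z'),
        (525903846, 4179064, '2020-01-14T16:13:24Z'),
        (399457300, 4179064, '2019-01-15T19:38:05Z')]
conn.executemany('INSERT INTO issues VALUES (?, ?, ?)', rows)

# The query behind the page: ISO-8601 timestamps sort correctly as text
result = conn.execute(
    'SELECT id FROM issues WHERE user = ? ORDER BY updated_at DESC',
    (4179064,)).fetchall()
print(result)  # [(314764258,), (525903846,), (399457300,)]
```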