issues


1 row where state = "open" and user = 4179064 sorted by updated_at descending

| id | node_id | number | title | user | state | locked | comments | created_at | updated_at | author_association | repo | type |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 314764258 | MDU6SXNzdWUzMTQ3NjQyNTg= | 2064 | concat_dim getting added to *all* variables of multifile datasets | xylar 4179064 | open | 0 | 31 | 2018-04-16T18:16:25Z | 2022-07-16T14:11:42Z | NONE | xarray | issue |

Code Sample

Using the following example data set: example_jan.nc

```python
#!/usr/bin/env python3

import xarray

ds = xarray.open_mfdataset('example_jan.nc', concat_dim='Time')
print(ds)
```

The result from xarray 0.10.2 (and all previous xarray versions we've worked with):

```
Dimensions:  (Time: 1, nOceanRegions: 7, nOceanRegionsTmp: 7, nVertLevels: 100)
Dimensions without coordinates: Time, nOceanRegions, nOceanRegionsTmp, nVertLevels
Data variables:
    time_avg_avgValueWithinOceanLayerRegion_avgLayerTemperature  (Time, nOceanRegionsTmp, nVertLevels) float64 dask.array<shape=(1, 7, 100), chunksize=(1, 7, 100)>
    time_avg_avgValueWithinOceanRegion_avgSurfaceTemperature     (Time, nOceanRegions) float64 dask.array<shape=(1, 7), chunksize=(1, 7)>
    time_avg_daysSinceStartOfSim                                 (Time) timedelta64[ns] dask.array<shape=(1,), chunksize=(1,)>
    xtime_end                                                    (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    xtime_start                                                  (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    refBottomDepth                                               (nVertLevels) float64 dask.array<shape=(100,), chunksize=(100,)>
Attributes:
    history:  Tue Dec 6 04:49:14 2016: ncatted -O -a ,global,d,, acme_alaph7...
    NCO:      "4.6.2"
```

The results with xarray 0.10.3:

```
<xarray.Dataset>
Dimensions:  (Time: 1, nOceanRegions: 7, nOceanRegionsTmp: 7, nVertLevels: 100)
Dimensions without coordinates: Time, nOceanRegions, nOceanRegionsTmp, nVertLevels
Data variables:
    time_avg_avgValueWithinOceanLayerRegion_avgLayerTemperature  (Time, nOceanRegionsTmp, nVertLevels) float64 dask.array<shape=(1, 7, 100), chunksize=(1, 7, 100)>
    time_avg_avgValueWithinOceanRegion_avgSurfaceTemperature     (Time, nOceanRegions) float64 dask.array<shape=(1, 7), chunksize=(1, 7)>
    time_avg_daysSinceStartOfSim                                 (Time) timedelta64[ns] dask.array<shape=(1,), chunksize=(1,)>
    xtime_end                                                    (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    xtime_start                                                  (Time) |S64 dask.array<shape=(1,), chunksize=(1,)>
    refBottomDepth                                               (Time, nVertLevels) float64 dask.array<shape=(1, 100), chunksize=(1, 100)>
Attributes:
    history:  Tue Dec 6 04:49:14 2016: ncatted -O -a ,global,d,, acme_alaph7...
    NCO:      "4.6.2"
```
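The change can be reproduced without the NetCDF file. `open_mfdataset` combines files with `xarray.concat` under the hood, and with `data_vars='all'` every data variable is broadcast along the concat dimension, even a variable that never had it. A minimal in-memory sketch (synthetic variable names and values standing in for the contents of example_jan.nc):

```python
import numpy as np
import xarray as xr

# Synthetic stand-ins: one variable with a Time dimension, and
# refBottomDepth, which has no Time dimension in the input.
ds = xr.Dataset(
    {
        "time_avg_daysSinceStartOfSim": (("Time",), np.array([31.0])),
        "refBottomDepth": (("nVertLevels",), np.linspace(1.0, 100.0, 4)),
    }
)

# Concatenating along Time with data_vars='all' inserts the dimension
# into every data variable, mirroring the 0.10.3 behaviour shown above.
combined = xr.concat([ds], dim="Time", data_vars="all")
print(combined["refBottomDepth"].dims)  # ('Time', 'nVertLevels')
```

The size-1 Time dimension appears on `refBottomDepth` even though only one dataset was concatenated, which matches the single-file case in the report.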

Problem description

We expected refBottomDepth not to gain Time as a dimension: it does not vary with time and has no Time dimension in the input data set.

It seems like https://github.com/pydata/xarray/issues/1988 and https://github.com/pydata/xarray/pull/2048 were intended to address cases where the concat_dim was not yet present in the input files. But in cases where concat_dim is already in the input files, only the fields that include this dimension should be concatenated, and the other fields should remain free of concat_dim. Part of the problem for us is that the number of dimensions of some of our variables now changes depending on which xarray version is in use.
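One possible mitigation (our suggestion, not something proposed in the issue itself) is the `data_vars='minimal'` option that `xarray.concat` accepts, and that PR #2048 exposed on `open_mfdataset`: it concatenates only the variables that already contain the concat dimension and passes the rest through unchanged. A sketch with synthetic monthly data:

```python
import numpy as np
import xarray as xr

# Two synthetic monthly "files"; refBottomDepth is identical in both
# and has no Time dimension.
ds1 = xr.Dataset(
    {
        "time_avg_daysSinceStartOfSim": (("Time",), np.array([31.0])),
        "refBottomDepth": (("nVertLevels",), np.linspace(1.0, 100.0, 4)),
    }
)
ds2 = ds1.copy(deep=True)
ds2["time_avg_daysSinceStartOfSim"] = (("Time",), np.array([59.0]))

# data_vars='minimal' concatenates only variables that already contain
# the concat dimension; refBottomDepth keeps its original dimensions.
combined = xr.concat([ds1, ds2], dim="Time", data_vars="minimal")
print(combined["time_avg_daysSinceStartOfSim"].dims)  # ('Time',)
print(combined["refBottomDepth"].dims)                # ('nVertLevels',)
```

This gives the 0.10.2-style result for variables like refBottomDepth while still concatenating the time-dependent variables.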

Expected Output

The output shown above for xarray 0.10.2.

Output of xr.show_versions()

```
/home/xylar/miniconda2/envs/mpas_analysis_py3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Linux
OS-release: 4.13.0-38-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

xarray: 0.10.3
pandas: 0.22.0
numpy: 1.14.2
scipy: 1.0.1
netCDF4: 1.3.1
h5netcdf: 0.5.1
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.2
distributed: 1.21.6
matplotlib: 2.2.2
cartopy: 0.16.0
seaborn: None
setuptools: 39.0.1
pip: 9.0.3
conda: None
pytest: 3.5.0
IPython: None
sphinx: 1.7.2
```
Reactions:

```json
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2064/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
```

The [issues] table schema:

```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
```
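The schema can be exercised with Python's built-in sqlite3 module. This sketch recreates the page's query (state = "open", user = 4179064, sorted by updated_at descending); the REFERENCES clauses are omitted because the referenced users/milestones/repos tables are not shown here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE issues (
    id INTEGER PRIMARY KEY, node_id TEXT, number INTEGER, title TEXT,
    user INTEGER, state TEXT, locked INTEGER, assignee INTEGER,
    milestone INTEGER, comments INTEGER, created_at TEXT, updated_at TEXT,
    closed_at TEXT, author_association TEXT, active_lock_reason TEXT,
    draft INTEGER, pull_request TEXT, body TEXT, reactions TEXT,
    performed_via_github_app TEXT, state_reason TEXT, repo INTEGER, type TEXT
);
CREATE INDEX idx_issues_user ON issues (user);
""")

# Insert the single row shown on this page.
conn.execute(
    "INSERT INTO issues (id, number, title, user, state, updated_at, repo, type) "
    "VALUES (314764258, 2064, "
    "'concat_dim getting added to *all* variables of multifile datasets', "
    "4179064, 'open', '2022-07-16T14:11:42Z', 13221727, 'issue')"
)

# The query described at the top of the page.
rows = conn.execute(
    "SELECT number, title FROM issues "
    "WHERE state = 'open' AND user = 4179064 ORDER BY updated_at DESC"
).fetchall()
print(rows)
# → [(2064, 'concat_dim getting added to *all* variables of multifile datasets')]
```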