issues

4 rows where user = 1191149 sorted by updated_at descending

Facets:

  • type: issue 2, pull 2
  • state: closed 4
  • repo: xarray 4

Columns: id, node_id, number, title, user, state, locked, assignee, milestone, comments, created_at, updated_at, closed_at, author_association, active_lock_reason, draft, pull_request, body, reactions, performed_via_github_app, state_reason, repo, type

issue #6103: reindex multidimensional fill_value skipping
id: 1087160635 · node_id: I_kwDOAMm_X85AzME7 · user: barronh (1191149) · state: closed · locked: 0 · comments: 1 · created_at: 2021-12-22T20:14:00Z · updated_at: 2021-12-22T20:17:39Z · closed_at: 2021-12-22T20:17:38Z · author_association: CONTRIBUTOR

What happened:

I started with a DataFrame that represented an identity matrix and used reindex with multiple dimensions and a fill_value. The goal was to produce a Dataset from a sparse DataFrame, and the identity matrix was the simplest example. The fill_value in reindex is documented as the "Value to use for newly missing values." What I found was that fill_value was not applied at coordinates that were already in the unique set of any coordinate. For a pure identity matrix, that means fill_value is not applied anywhere (all rows are present, all columns are present). When I thin the identity matrix by skipping elements, the error is more obvious: on the rows and columns that have valid input data, the fill_value is not applied.

What you expected to happen:

I expected all new NaN values to be filled with the fill_value.

Minimal Complete Verifiable Example:

```python
import numpy as np
import pandas as pd

n = 10
thin = 2

df = pd.DataFrame.from_dict(
    [dict(ROW=v, COL=v, LAND=1) for v in np.arange(0, n, thin)]
).set_index(['ROW', 'COL'])

ds = df.to_xarray()
rds = ds.reindex(ROW=np.arange(n), COL=np.arange(n), fill_value=0)

p = rds.LAND.plot()
p.axes.set_facecolor('red')
p.axes.figure.savefig('test.png')

print(rds.LAND[:])
print(rds.LAND[::thin, ::thin])
```

Output:

```
<xarray.DataArray 'LAND' (ROW: 10, COL: 10)>
array([[ 1.,  0., nan,  0., nan,  0., nan,  0., nan,  0.],
       [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
       [nan,  0.,  1.,  0., nan,  0., nan,  0., nan,  0.],
       [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
       [nan,  0., nan,  0.,  1.,  0., nan,  0., nan,  0.],
       [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
       [nan,  0., nan,  0., nan,  0.,  1.,  0., nan,  0.],
       [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
       [nan,  0., nan,  0., nan,  0., nan,  0.,  1.,  0.],
       [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.]])
Coordinates:
  * ROW      (ROW) int64 0 1 2 3 4 5 6 7 8 9
  * COL      (COL) int64 0 1 2 3 4 5 6 7 8 9
<xarray.DataArray 'LAND' (ROW: 5, COL: 5)>
array([[ 1., nan, nan, nan, nan],
       [nan,  1., nan, nan, nan],
       [nan, nan,  1., nan, nan],
       [nan, nan, nan,  1., nan],
       [nan, nan, nan, nan,  1.]])
Coordinates:
  * ROW      (ROW) int64 0 2 4 6 8
  * COL      (COL) int64 0 2 4 6 8
```
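Not part of the original report, but worth noting when reading the output above: reindex only fills the positions created by the new ROW/COL labels, while the NaNs at pre-existing label combinations come from to_xarray() densifying the sparse DataFrame. A hypothetical way to obtain the fully filled matrix the report expected, continuing from the example above, is to fill the remaining NaNs afterwards:

```python
# Hypothetical follow-up to the example above (not from the report):
# also fill the NaNs that to_xarray() introduced for ROW/COL
# combinations that were missing from the sparse DataFrame.
filled = ds.reindex(ROW=np.arange(n), COL=np.arange(n), fill_value=0).fillna(0)
print(filled.LAND[::thin, ::thin])  # identity matrix with explicit zeros off the diagonal
```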

Anything else we need to know?:

Environment:

Output of xr.show_versions()

```
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.12 (default, Sep 10 2021, 00:21:48) [GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 5.4.144+
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.18.2
pandas: 1.1.5
numpy: 1.19.5
scipy: 1.4.1
netCDF4: 1.5.8
pydap: None
h5netcdf: None
h5py: 3.1.0
Nio: None
zarr: None
cftime: 1.5.1.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2.12.0
distributed: 1.25.3
matplotlib: 3.2.2
cartopy: None
seaborn: 0.11.2
numbagg: None
pint: None
setuptools: 57.4.0
pip: 21.1.3
conda: None
pytest: 3.6.4
IPython: 5.5.0
sphinx: 1.8.6
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6103/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

pull #3485: uamiv test using only raw uamiv variables
id: 517522619 · node_id: MDExOlB1bGxSZXF1ZXN0MzM2NTc1NzYx · user: barronh (1191149) · state: closed · locked: 0 · comments: 2 · created_at: 2019-11-05T03:07:14Z · updated_at: 2019-11-05T15:42:36Z · closed_at: 2019-11-05T15:42:35Z · author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/3485

The previous test relied on CF-generated metadata; this test is more robust because it uses only raw uamiv variables.

  • [x] Addresses #3409 #3420 #3434
  • [x] Tests fixed

My development machine is missing a Python 3.6 environment, so I am using Azure to check. This should help me get this fix in faster.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3485/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

pull #1905: Added PNC backend to xarray
id: 296561316 · node_id: MDExOlB1bGxSZXF1ZXN0MTY4NzE1OTQy · user: barronh (1191149) · state: closed · locked: 0 · comments: 34 · created_at: 2018-02-12T23:26:36Z · updated_at: 2018-06-01T11:40:58Z · closed_at: 2018-06-01T04:21:44Z · author_association: CONTRIBUTOR · draft: 0 · pull_request: pydata/xarray/pulls/1905

PNC (PseudoNetCDF) is used for GEOS-Chem, CAMx, CMAQ and other atmospheric data formats, including NOAA ARL packed-bit and HYSPLIT-related files. The goal is to provide access to many file types that have their own file formats and metadata conventions through a CF-compliant, netCDF-like interface. A usage sketch follows the checklist below.

  • [x] Tests were run using example datasets, but the datasets were not added to the repository because they are large and diverse. Only files that provide CF-compliant structures will successfully open. Confirmed by @bbakernoaa
  • [x] Tests added (for all bug fixes or enhancements)
  • [x] Tests passed (for all non-documentation changes)
  • [x] Fully documented, including whats-new.rst for all changes and api.rst for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)
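
Not part of the PR text: a hypothetical usage sketch of the backend this PR adds, assuming it is registered with xarray.open_dataset as engine="pseudonetcdf" and that the PseudoNetCDF reader format can be passed through backend_kwargs; the file name is made up.

```python
import xarray as xr

# Hypothetical example: open a CAMx uamiv file through the PseudoNetCDF
# backend added by this PR. The file name and the "format" keyword are
# assumptions for illustration, not taken from the PR itself.
ds = xr.open_dataset(
    "camx_output.uamiv",
    engine="pseudonetcdf",
    backend_kwargs={"format": "uamiv"},
)
print(ds)
```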
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/1905/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull

issue #2025: dask ImportWarning causes pytest failure
id: 309592370 · node_id: MDU6SXNzdWUzMDk1OTIzNzA= · user: barronh (1191149) · state: closed · locked: 0 · comments: 2 · created_at: 2018-03-29T02:01:50Z · updated_at: 2018-03-30T03:44:55Z · closed_at: 2018-03-30T03:44:55Z · author_association: CONTRIBUTOR

Code Sample, a copy-pastable example if possible

```bash
# Your code here
pytest -v xarray/tests/test_backends.py::NetCDF4DataTest
```

Problem description

Instead of passing all tests, this fails on the test_88_character_filename_segmentation_fault function.

```
E AssertionError: exception ImportWarning("can't resolve package from __spec__ or __package__, falling back on __name__ and __path__",) did not match pattern 'segmentation fault'
```

It oddly doesn't seem to be related to the 88-character issue. I traced this down to xarray/backends/common.py::get_scheduler, where an ImportWarning is not being caught on line 49.

Expected Output

Passes all tests. Adding ImportWarning to the tuple of exceptions in get_scheduler fixes the problem.
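
A minimal sketch of the pattern described above, not the actual xarray source: it assumes get_scheduler probes for optional dask support inside a try/except and treats a raised ImportWarning the same as a failed import. The function name below is illustrative.

```python
import warnings


def detect_scheduler():
    """Illustrative stand-in for the get_scheduler probe described above."""
    with warnings.catch_warnings():
        # In the reporter's environment an ImportWarning raised while
        # importing dask escaped as an error instead of being swallowed.
        warnings.simplefilter("error", ImportWarning)
        try:
            import dask  # noqa: F401
        except (ImportError, ImportWarning):
            # Adding ImportWarning to the caught exceptions is the
            # one-line change the issue proposes.
            return None
    return "dask"


print(detect_scheduler())
```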

Output of xr.show_versions()

```
>>> xr.show_versions()
/Users/barronh/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 64
OS: Darwin
OS-release: 17.4.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
xarray: 0.10.2
pandas: 0.22.0
numpy: 1.14.2
scipy: 1.0.0
netCDF4: 1.3.1
h5netcdf: None
h5py: 2.7.1
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: 0.17.1
distributed: 1.21.3
matplotlib: 2.2.2
cartopy: None
seaborn: 0.8.1
setuptools: 38.4.0
pip: 9.0.1
conda: 4.5.0
pytest: 3.4.2
IPython: 6.2.1
sphinx: 1.6.6
```
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2025/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);
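
For reference, a hypothetical way to reproduce this page's row selection ("4 rows where user = 1191149 sorted by updated_at descending") against a local copy of the database; the file name github.db is an assumption.

```python
import sqlite3

# Hypothetical: query a local copy of the database behind this page.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    'SELECT number, title, updated_at FROM issues '
    'WHERE "user" = ? ORDER BY updated_at DESC',
    (1191149,),
).fetchall()
for number, title, updated_at in rows:
    print(number, title, updated_at)
```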
Powered by Datasette · About: xarray-datasette